The new information was then used to create a piece of hardware that helped the FBI to crack the iPhone's four-digit personal identification number without triggering a security feature that would have erased all the data, the individuals said.
The researchers, who typically keep a low profile, specialize in hunting for software vulnerabilities and, in some cases, selling them to the US government. They were paid a one-time flat fee for the solution.
Cracking the four-digit PIN, which the FBI had estimated would take 26 minutes, was not the hard part for the bureau. The challenge from the beginning was disabling a feature on the phone that wipes data stored on the device after 10 incorrect tries at guessing the code. A second feature also imposes increasingly long delays between successive attempts.
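The 26-minute estimate is simple arithmetic: a four-digit PIN has only 10,000 possibilities, so once the wipe and delay features are out of the way, even a slow guessing rate exhausts them quickly. A minimal back-of-the-envelope sketch, assuming a per-guess time chosen only to reproduce the bureau's figure (the actual rate has not been disclosed):

```python
# Rough sketch of the brute-force arithmetic, not the FBI's actual tool.
# The per-attempt time is an assumption picked to match the 26-minute estimate.

PIN_SPACE = 10_000              # four digits: 0000 through 9999
SECONDS_PER_ATTEMPT = 0.156     # assumed guess rate; real figure undisclosed

worst_case = PIN_SPACE * SECONDS_PER_ATTEMPT / 60   # try every PIN
average_case = worst_case / 2                        # expected value on average

print(f"Worst case: {worst_case:.0f} minutes")    # ~26 minutes
print(f"Average:    {average_case:.0f} minutes")  # ~13 minutes

# With the auto-wipe feature active, only 10 of those 10,000 guesses are
# available before the data is erased -- which is why disabling that feature,
# not the guessing itself, was the hard part.
```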
The bureau in this case did not need the services of the Israeli firm Cellebrite, as some earlier reports had suggested, people familiar with the matter said.
The US government now has to weigh whether to disclose the flaws to Apple, a decision that probably will be made by a White House-led group.
The people who helped the US government come from the sometimes shadowy world of hackers and security researchers who profit from finding flaws in companies' software or systems.
Some hackers, known as "white hats," disclose the vulnerabilities to the firms responsible for the software or to the public so they can be fixed and are generally regarded as ethical. Others, called "black hats," use the information to hack networks and steal people's personal information.
At least one of the people who helped the FBI in the San Bernardino, California, case falls into a third category, often considered ethically murky: researchers who sell flaws to governments, companies that make surveillance tools or groups on the black market.
This last group, dubbed "gray hats," can be controversial, because critics say they might be helping governments spy on their own citizens. Their tools, however, might also be used to track terrorists or hack an adversary spying on the United States. When selling exploits to governments or on the black market, these researchers do not disclose the flaws to the companies responsible for the software, as the exploits' value depends on the software remaining vulnerable.
In the case of the San Bernardino iPhone, the solution brought to the bureau has a limited shelf life.
FBI Director James B. Comey has said that the solution works only on iPhone 5c handsets running the iOS 9 operating system - what he calls a "narrow slice" of phones.
Apple said last week that it would not sue the government to gain access to the San Bernardino solution.
Still, many security and privacy experts have been calling on the government to disclose the vulnerability data to Apple so that the firm can patch it.
If the government shares data on the flaws with Apple, "they're going to fix it and then we're back where we started from," Comey said in a discussion at a privacy conference last week. Nonetheless, he said Monday in Miami, "we're considering whether to make that disclosure or not."
The White House has established a process in which federal officials weigh whether to disclose any security vulnerabilities they find. It could be weeks before the FBI's case is reviewed, officials said.
"When we discover these vulnerabilities, there's a very strong bias towards disclosure," White House cyber-security coordinator Michael Daniel said in an interview in October 2014, speaking generally and not about the Apple case. "That's for a good reason. If you had to pick the economy and the government that is most dependent on a digital infrastructure, that would be the United States."
But, he added, "we do have an intelligence and national security mission that we have to carry out. That is a factor that we weigh in making our decisions."
The decision-makers, who include senior officials from the Justice Department, FBI, National Security Agency, CIA, State Department and Department of Homeland Security, consider how widely used the software in question is. They also look at the utility of the flaw that has been discovered. Can it be used to track members of a terrorist group, to prevent a cyberattack, to identify a nuclear weapons proliferator? Is there another way to obtain the information?
In the case of the phone used by the San Bernardino terrorist, "you could make the justification on both national security and on law enforcement grounds because of the potential use by terrorists and other national security concerns," said a senior administration official, speaking on the condition of anonymity because of the matter's sensitivity.
A decision also can be made to disclose the flaw - just not right away. An agency might say it needs the vulnerability for only a few months or that its utility will quickly diminish.
"A decision to withhold a vulnerability is not a forever decision," Daniel said in the earlier interview. "We require periodic reviews. So if the conditions change, if what was originally a true [undiscovered flaw] suddenly becomes identified, we can make the decision to disclose it at that point."
© 2016 The Washington Post