Over the course of the last three years, more than 11,200 defects have been eliminated from over 180 open source projects. The defect reduction comes in part from the Coverity Scan effort, originally funded by the U.S. Department of Homeland Security (DHS) in 2006. DHS no longer funds the effort, but Coverity has continued to operate the scan project on its own.
Coverity has seen an overall 16 percent reduction in the defect density found in the projects it has scanned over the last three years. Yet while the defect density has declined, the most recent Coverity Scan Open Source Report notes that the most common defect types are holding steady. For the last two years, the most common defect type reported by Coverity in its open source scan has been the ‘NULL pointer dereference.’
The problem with a NULL pointer dereference is that a program tries to access memory through a pointer that doesn’t point to a valid location. As a result, an application can crash and, in some cases, be exposed to security exploits. The persistence of NULL pointer defects speaks to the continuing need to scan code to eliminate errors.
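To make the defect class concrete, here is a minimal, hypothetical C sketch (not code from any project Coverity scanned): a lookup function can legitimately return NULL, but the caller dereferences the result without checking it first.

    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical illustration of a NULL pointer dereference. */
    int *find_value(int key) {
        if (key < 0)
            return NULL;              /* lookup failed */
        int *v = malloc(sizeof *v);
        if (v != NULL)
            *v = key * 2;
        return v;                     /* may also be NULL if malloc failed */
    }

    int main(void) {
        int *v = find_value(-1);
        printf("%d\n", *v);           /* crash: v is NULL here */
        free(v);
        return 0;
    }

A static analyzer can flag this because one path out of find_value() returns NULL while the caller unconditionally dereferences the returned pointer; the fix is simply to test the pointer before using it.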
“The use of pointers is very common and pointers are just difficult to get right for lots of different reasons,” Andy Chou, CTO and co-founder of Coverity, told InternetNews.com. “Developers often don’t keep careful track of whether or not a pointer is NULL, and as a result they make mistakes. Because pointers are so common, the number of chances they have for making a mistake is also very high. So even if they only screw up a small percentage of the time, the number of defects may still be large since they have so many opportunities to mess up.”
Chou added that releasing the scan results enables developers to see exactly where in the code they are making mistakes. As developers start to fix defects and see what they’ve done wrong, they tend to change their programming habits and not make the same mistakes going forward. Chou also noted that Coverity’s defect scan results are cumulative over the past three years, so the historical impact of having had many NULL pointer issues in previous years will continue to show up in the statistics.
In addition to NULL pointer errors, there has been an increase in the frequency of another type of flaw known as an ‘uninitialized value read.’
“Uninitialized values are a common programming defect where developers don’t set values for variables before they start using them,” Chou explained. “So whatever garbage is left over in the memory used by those variables will be read, and you’ll do incorrect computations as a result.”
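The same pattern can be sketched in a few lines of C (again a hypothetical example, not taken from the report): the accumulator below is never initialized, so whatever value happens to be left in that stack slot is folded into the result.

    #include <stdio.h>

    /* Hypothetical illustration of an uninitialized value read. */
    int average(const int *values, int count) {
        int total;                        /* never set to 0 */
        for (int i = 0; i < count; i++)
            total += values[i];           /* reads uninitialized 'total' the first time through */
        return count > 0 ? total / count : 0;
    }

    int main(void) {
        int data[] = {3, 5, 7};
        printf("%d\n", average(data, 3)); /* result depends on leftover memory contents */
        return 0;
    }

Initializing total to 0 before the loop removes the defect; compilers can warn about simple cases like this one, but less obvious instances are what deeper static analysis is aimed at.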
Chou added that the defect is common, though until this year it was under-reported. Coverity has improved its analyzer to perform deeper checking for this problem, and because of that change more defects were found.
“I don’t think there has been a change in the rate at which the defects appear in the code, it’s more that our ability to detect the defects changed,” Chou said.
That said, Coverity is planning further efforts to better inform and alert developers to the severity of the bugs it finds. David Maxwell, open source strategist at Coverity, noted that the next version of the scanning software will include an impact value. That value will help alert developers to the potential impact a particular bug could have on the overall code.
The Coverity Scan project works by running a scan on code that has already been committed to a project. In 2006, Coverity was awarded a three-year Department of Homeland Security grant to help improve open source code; that grant expired at the end of 2008. Helping developers produce clean code before it formally ends up in a project’s source tree is also something Coverity has its eye on.
Chou noted that Coverity currently has IDE plug-ins for both the open source Eclipse IDE and Microsoft’s Visual Studio for some desktop scanning.
“Over time we do plan on investing more in that capability so that developers can do deeper checking on the desktop without taking a very long time. That’s one of the challenges, it does take a fair amount of time to do the in-depth analysis.”
Chou added that Coverity is also planning to support other IDEs over time.
“We do see that the desktop is an area for major improvements in the static code analysis market,” Chou said. “Clearly developers are interested in having it there because that is how they work.”
Article courtesy of InternetNews.com.