A review of memory errors exploitation in x86-64

Conor Pirry, Hector Marco-Gisbert*, Carolyn Begg

*Corresponding author for this work

Research output: Contribution to journal › Article


Abstract

Memory errors are still a serious threat affecting millions of devices worldwide. Recently, bounty programs have reached a new record, paying up to USD 2.5 million for a single vulnerability in Android and up to USD 2 million for Apple's operating system. In almost all cases, memory errors are exploited in one or more stages to fully compromise those devices. In this paper, we review and discuss the importance of memory error vulnerabilities, and more specifically stack buffer overflows, to provide a full view of how memory errors are exploited. We identify the root causes that make those attacks possible on the modern x86-64 architecture in the presence of modern protection techniques. We have analyzed how unsafe library functions are prone to buffer overflows, revealing that although secure versions of those functions exist, they do not actually prevent buffer overflows from happening. Using secure functions does not result in software free from vulnerabilities, and it requires developers to be security-aware. To overcome this problem, we discuss the three main security protection techniques present in all modern operating systems: the non-eXecutable bit (NX), the Stack Smashing Protector (SSP), and Address Space Layout Randomization (ASLR). After discussing their effectiveness, we conclude that although they provide a strong level of protection against classical exploitation techniques, modern attacks can bypass them.
Original language: English
Article number: 48
Number of pages: 21
Journal: Computers
Volume: 9
Issue number: 2
DOIs
Publication status: Published - 8 Jun 2020

Keywords

  • memory errors
  • x86-64
  • stack buffer overflows
  • SSP
  • ASLR
  • NX

