Analyzing Popular Smart Contracts with Slither-Analyzer Tool

Summary

We analyzed 15 smart contracts from 3 popular Ethereum projects using the slither-analyzer tool by Trail of Bits. Over 270 detections were reported, of which 10 were high-impact and high-confidence and another 17 were high-impact and medium-confidence. Given that these contracts are deployed on the Ethereum mainnet, that they have not been hacked so far, and that they handle significant amounts of money in assets, we consider these detections to be false positives.

We conclude that a Slither detection by itself does not necessarily mean that a contract is insecure. This implies that, while useful, Slither does not replace an audit by a security specialist.

Feel free to contact our smart contract auditing service for a full security assessment of your smart contracts.

Introduction

In this article we study the vulnerabilities detected by Slither on popular smart contracts currently deployed and active on the Ethereum network. For this analysis, we focus on deployments made by three major DeFi projects: 1inch, Aave, and Uniswap.

Considering the security audits that these projects have undergone in the last few years, as well as their frequent exposure to large numbers of users, we conduct our analysis assuming that these smart contracts are free of vulnerabilities. Under this assumption, our objective with this article is to understand the frequency and severity of false positives obtained with Slither.

Analysis

The results were classified according to their impact and confidence categories. We observe that, out of the 270 potential vulnerabilities detected, 10 (~3.7%) were both high-impact and high-confidence, and another 17 were high-impact and medium-confidence.

Furthermore, we observe that these 27 high-impact detections are distributed among 11 out of the 15 analyzed contracts, which include at least 3 contracts for each of the parent projects analyzed (see Methodology section below for the addresses of the analyzed contracts).

Conclusion

From the analysis of these battle-tested smart contracts we conclude that a manual review of the detections produced by static analysis tools is necessary to rule out false positives. This is part of the work we do when performing our smart-contract security audits.

Methodology

For this study, we used the current version of Slither (v0.9.1), together with solc-select (v1.0.2) to automatically switch to the expected version of solc used in the compilation of each smart contract we analyzed. These tools were run on Ubuntu 20.04.4 LTS over the list of 15 deployed smart contracts that we summarize below.

Table 1: Analyzed Projects

 

Impact \ Confidence | High | Medium
--------------------|------|-------
High                |  10  |   17
Medium              |      |
Low                 |      |
Informational       |      |
Optimization        |      |

To build this list, we reviewed the main contracts of each parent project and looked for smart contracts satisfying the following criteria:

  • Smart contracts whose source code was verified on Etherscan
  • Smart contracts that received transactions within the month prior to this article

For each smart contract listed above, we used its deployment address to run the command slither <address> --show-ignored-findings --json <analysis_tag>.json, obtaining a .json file with the results of Slither's static analysis. The --show-ignored-findings flag ensures that findings suppressed through inline slither-disable comments (or a previous triage) are still reported.
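The per-contract runs can be scripted. The sketch below is a minimal batch driver, assuming slither is on the PATH and solc-select already points at the compiler version each contract expects; the function names and the analysis_<i> tag scheme are ours, not part of the study.

```python
import subprocess

def slither_cmd(address: str, tag: str) -> list[str]:
    # Build the Slither invocation described above for one deployment address.
    return ["slither", address, "--show-ignored-findings", "--json", f"{tag}.json"]

def analyze_all(addresses: list[str]) -> None:
    for i, addr in enumerate(addresses):
        # check=False: Slither may exit non-zero when it reports findings,
        # which is expected here rather than an error.
        subprocess.run(slither_cmd(addr, f"analysis_{i}"), check=False)
```

Each run writes one analysis_<i>.json report next to the script, ready for the filtering step described below.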

We used the results in these .json files to analyze the detected issues in the Analysis section. For this analysis, we focused on the confidence and impact categorization of detectors provided by Slither, considering only potential security vulnerabilities and discarding detections deemed only as Informational or Optimization.
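The filtering step can be sketched as follows. This assumes the report layout produced by slither --json, where detections sit under results → detectors with impact and confidence fields; that layout is worth double-checking against your Slither version, and the helper names are ours.

```python
import json
from collections import Counter

# Categories discarded in our analysis, as they are not security vulnerabilities.
EXCLUDED = {"Informational", "Optimization"}

def tally_detections(report: dict) -> Counter:
    # Count detections by (impact, confidence), skipping excluded categories.
    counts = Counter()
    for det in report.get("results", {}).get("detectors", []):
        if det.get("impact") in EXCLUDED:
            continue
        counts[(det.get("impact"), det.get("confidence"))] += 1
    return counts

def tally_file(path: str) -> Counter:
    # Convenience wrapper for one of the .json reports produced by Slither.
    with open(path) as fh:
        return tally_detections(json.load(fh))
```

Summing the counters across all 15 reports yields the per-cell totals discussed in the Analysis section.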

Table 2: Impact and Confidence classification of detectors considered in our analysis.

 
