The number of files may not correspond to the number of different requests that might be cached.
For instance, all my sites use only one index.php file. The content displayed depends on additional parameters in the URI. The "welcome" page is represented as /index.php?go=welcome; the "contact us" page is /index.php?go=contact, and so forth. So my one index.php file can generate literally dozens of different pages.
If you are sure that only one page is generated per file, then you can count all the files ending in .php in /var/www and its subdirectories with

find /var/www -name '*.php' | wc -l

(Something like "ls -lR *.php | wc -l" will not work here: the shell expands *.php before ls runs, so it only matches files in the current directory, and the -l listing adds header and directory lines that inflate the count.)
Otherwise you can use "wget -r http://www.example.com/" to crawl the site and count the URLs it visits. The "-r" ("recursive") option tells wget to follow every link it finds starting from the home page. Note that this only discovers pages that are actually linked from somewhere on the site.
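One way to turn that crawl into a number is a sketch like the following. It assumes wget's progress output, which prints a "--timestamp--  URL" line for each URL it visits (the exact format can vary between wget versions, so check yours first); adding --spider makes wget check the links without saving any files. www.example.com is a placeholder for your own site.

```shell
# Crawl the site without saving files, pull out each visited URL,
# deduplicate, and count the result.
wget -r --spider http://www.example.com/ 2>&1 \
  | grep '^--' \
  | awk '{print $3}' \
  | sort -u \
  | wc -l
```

The sort -u step matters because wget may report the same URL more than once (for example, the home page linked from every other page).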