Are you sure you want to do this? There are a lot of packages, and it's unlikely that your users will want all of them.
Naively, you could get a vector of all packages

pkgs = available.packages(repos=biocinstallRepos())

and install them and their dependencies. For some, you will likely need to install both simple (e.g., libcurl-dev, libxml2-dev) and complex system dependencies. At the end of the day you will be able to say that every package is installed.
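As a sketch, the naive approach in full (assuming the BiocInstaller-era tools used in this answer; note that biocLite() resolves R package dependencies, but not system ones):

library(BiocInstaller)
## every package known to the Bioconductor + CRAN repositories for this R version
pkgs <- rownames(available.packages(repos = biocinstallRepos()))
## install everything; system dependencies (libcurl-dev, libxml2-dev, ...) must
## already be in place, or some packages will fail to build
biocLite(pkgs)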
A better strategy is likely to create a local mirror of the Bioconductor CRAN-style repository, and then to install / update from there. See the siteRepos argument to BiocInstaller::biocLite(), and the repos argument to install.packages() (but use biocLite() rather than install.packages() for installation, to more correctly manage R / Bioconductor versions).
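For example (a sketch; the mirror location below is hypothetical):

## hypothetical local CRAN-style mirror of the Bioconductor repositories
myRepo <- "file:///data/bioc-mirror"
biocLite("IRanges", siteRepos = myRepo)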
Probably, though, you have several users that you are trying to support. Then the strategy is to allow users to install packages into their personal libraries, which is how R works 'out of the box': install R system-wide; when a user tries to install a package following the instructions on any package landing page (e.g., IRanges), they'll be prompted to create their own library in a standard location. You could have a core of packages installed centrally, e.g., by running, under the account used to install R, the commands

source("https://bioconductor.org/biocLite.R")
biocLite()
and perhaps installing other packages that you find are used by several users. But in the end I think this kind of 'management' is a disservice, at least to 'regular' users (maybe it is a benefit to novice users; 'advanced' users will just avoid the system installation). Perhaps there are security policies at your location that don't allow this sort of user independence; investigate changing those policies. There is some guidance in section 6 of the R Installation and Administration manual.
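To see how the personal-versus-system library arrangement looks from a user's session:

.libPaths()                  ## library search path; a personal library, if present, comes first
Sys.getenv("R_LIBS_USER")    ## default location offered when the user is prompted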