Shiv - Access can, in theory, manage individual database files of up to 2 GB each, though any database approaching that size becomes VERY difficult to use efficiently.
As Colin has pointed out, if you have an aggregate of about 15 GB, you are looking at not less than 8 contributing files, and I would recommend not less than 10 to spread the data out a bit more. However, if you have a single table that approaches or exceeds 2 GB on its own (e.g. 200 bytes per record x 10,000,000 records), then that table plus the overhead required for it simply will not fit, because a table cannot be split across files.
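As a back-of-the-envelope check, here is a sketch of that example's arithmetic (the 200-byte record size and 10-million row count are the hypothetical figures from above, ignoring page layout, indexes, and other Access overhead):

```python
# Rough check: will a table of fixed-size records fit under Access's
# 2 GB file-size ceiling? Figures are the hypothetical example above.
ACCESS_LIMIT_BYTES = 2 * 1024**3   # 2 GiB = 2,147,483,648 bytes

bytes_per_record = 200
record_count = 10_000_000

raw_size = bytes_per_record * record_count          # 2,000,000,000 bytes
headroom = ACCESS_LIMIT_BYTES - raw_size

print(f"raw data:  {raw_size / 1024**3:.2f} GiB")
print(f"headroom:  {headroom / 1024**2:.0f} MiB")
# Only about 140 MiB remains for indexes, page overhead, and system
# tables - in practice the table will not fit once overhead is added.
```

The raw data alone is about 1.86 GiB, so the remaining headroom is nowhere near enough for the indexing and storage overhead a 10-million-row table needs.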
At this point, based on your initial statement, you can still use Access as a "front-end" (the user interface), but the raw data (the back-end) will PROBABLY need to live on some flavor of database server, such as SQL Server, Oracle, or one of the other common options.
As to DLookup against a table with 10,000,000 rows: you should know that EACH CALL to DLookup works by building, executing, and analyzing the results of a query. If you used DLookup in a VBA loop, each call would have to search for a record among 10 million rows, and the expectation value (a statistical term) for an unindexed search is an average of 5 million rows visited PER QUERY. If you then had to do that even once per existing row, you would be dealing with 50 million million (= 50,000,000,000,000) touches to the database. Indexing reduces that enormously, but 50 trillion of ANYTHING is a lot, and even the indexed version of 10 million lookups is serious work.
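To put numbers on that, here is a sketch of the arithmetic (pure calculation, no database involved; the row count is the hypothetical 10-million figure from above):

```python
# Expected cost of an unindexed DLookup-style search, done once per row.
# An unindexed search over N rows visits N/2 rows on average - the
# "expectation value" described above.
rows = 10_000_000

avg_rows_per_lookup = rows // 2          # 5,000,000 rows per query
total_touches = rows * avg_rows_per_lookup
print(f"unindexed: {total_touches:,} row touches")
# 50,000,000,000,000 - fifty trillion

# For comparison, building an in-memory lookup (e.g. a hash/dictionary)
# first costs one pass to build plus one probe per row:
indexed_touches = rows + rows
print(f"with lookup table: {indexed_touches:,} row touches")
# 20,000,000 - about 2.5 million times fewer
```

This is why looping DLookup over a large table is a classic performance trap: the cost grows with the square of the table size, while a join or a pre-built lookup grows roughly linearly.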
I do not know how the criteria should be defined for all values of a field.
I understand. It can be daunting. But I need some clarification, because I don't know exactly what you are asking. Did you mean (a) you don't understand the three arguments of a DLookup, (b) you have some problem fields in your data rows and are not sure how to build the required selection criteria for them, or (c) something else?