Representing a particular date a million years ago strikes me as meaningless. Which calendar would apply, the Julian? Should the days of the week honor the Babylonian system?

Create your own type for this, and decide what you actually need to represent.
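For instance, here is a minimal sketch of such a type, assuming all you need is year-level resolution; the `DeepTimeDate` name and its fields are purely illustrative, not any standard API:

```javascript
// Illustrative only: a purpose-built type for deep time, assuming
// year-level resolution is enough. Nothing here is a standard API.
class DeepTimeDate {
  constructor(yearsBeforePresent) {
    // Store a plain year count instead of a millisecond timestamp;
    // calendars, weekdays, and leap rules simply don't apply.
    this.yearsBeforePresent = yearsBeforePresent;
  }
  toString() {
    return `${this.yearsBeforePresent.toLocaleString()} years before present`;
  }
}

console.log(new DeepTimeDate(1_000_000).toString());
// e.g. "1,000,000 years before present" (formatting is locale-dependent)
```

The point is that the representation matches the question you're actually answering, rather than inheriting Date's calendar assumptions.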
--- Updated: This was accepted, so I'll add a few more specific bits. ---
As mentioned in another answer, according to the ECMAScript spec, fifth edition, p. 164 (the link is a PDF):
> Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript Number values can represent all integers from –9,007,199,254,740,991 to 9,007,199,254,740,991; this range suffices to measure times to millisecond precision for any instant that is within approximately 285,616 years, either forward or backward, from 01 January, 1970 UTC. The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.
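Those limits are easy to confirm in any JavaScript engine; stepping one millisecond past the boundary yields an invalid Date:

```javascript
// ±100,000,000 days, expressed in milliseconds (from the spec above).
const MAX_ECMA_TIME_MS = 8_640_000_000_000_000;

console.log(new Date(MAX_ECMA_TIME_MS).toISOString());
// "+275760-09-13T00:00:00.000Z"
console.log(new Date(-MAX_ECMA_TIME_MS).toISOString());
// "-271821-04-20T00:00:00.000Z"

// One millisecond beyond the range produces an invalid Date.
console.log(new Date(MAX_ECMA_TIME_MS + 1).getTime()); // NaN
```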
But this is for theoretical dates; it ignores a few pieces of reality. Days were shorter (by about 12 seconds) a million years ago, so some JavaScript date math would be inaccurate over such spans (a rough estimate of the error follows below). Days of the week have been determined by different systems in different eras, and months have been defined differently. All to say: decide what you really need to represent.
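To put a number on that inaccuracy, here is a back-of-the-envelope sketch. It takes the 12-seconds-shorter figure above at face value and assumes day length grew roughly linearly over the interval, so the average shortfall is about half that; both are illustrative assumptions, not measured constants:

```javascript
// Cumulative error of assuming fixed 86,400-second days over a
// million years, given days that started ~12 s shorter and lengthened
// roughly linearly (both assumptions, for illustration only).
const avgShortfallSeconds = 12 / 2;   // mean deficit per day
const totalDays = 365.25 * 1_000_000; // days in a million years
const secondsPerYear = 86_400 * 365.25;

const driftYears = (avgShortfallSeconds * totalDays) / secondsPerYear;
console.log(driftYears.toFixed(0)); // ≈ 69: fixed-length-day arithmetic
                                    // drifts by decades over that span
```

Decades of drift make millisecond precision meaningless at that distance, which is the practical point here.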