Present physiological in vitro tests of Mg degradation follow a standard procedure specified by ASTM. Although useful for predicting the initial degradation behavior of an alloy, this standard is limited when interpreting degradation over longer immersion periods in cell culture media. This has an important consequence, since an alloy's degradation behavior is time dependent: even if two alloys show the same degradation rate in a short-term experiment, their degradation characteristics and mode of degradation may differ after a certain time interval. Furthermore, many studies of Mg degradation extrapolate the corrosion rate from a single time-point measurement to the order of a year, which may be inappropriate given the time-dependent degradation behavior of the alloy.

In this work, we address these issues by putting forward a new methodology for conducting and assessing immersion tests to determine the corrosion rates of Mg alloys. A model based on simple physical principles was developed to evaluate the degradation behavior over time by means of analytical expressions, using the experimental data as input parameters. For this purpose, long-term in vitro physiological degradation experiments (immersion tests) were conducted on cast and extruded Mg-2Ag and on powder-pressed and sintered Mg-0.3Ca alloy systems. DMEM + GlutaMAX (Dulbecco's Modified Eagle's Medium, (+) 4.5 g/L D-glucose, (+) pyruvate) supplemented with 10% FBS (fetal bovine serum) was used as the cell culture medium. Additionally, 1% penicillin-streptomycin was added to the medium prior to the immersion tests to prevent bacterial contamination. The degradation rate was calculated from the difference between the initial and final masses of each set of alloy samples, where each set corresponds to a specific intermediate immersion interval within the long-term immersion period.
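As a minimal sketch of such an interval-wise rate calculation, the snippet below applies the standard ASTM mass-loss relation, CR = K·ΔW / (A·t·ρ), to per-interval mass differences. The specific formula, the surface area, the alloy density, and all sample masses are illustrative assumptions, not values taken from the experiments described here.

```python
# Interval-wise corrosion rate from mass loss (sketch).
# Assumes the ASTM G31-style mass-loss relation:
#   CR [mm/year] = K * dW / (A * t * rho)
# with K = 8.76e4, dW in g, A in cm^2, t in h, rho in g/cm^3.
# All numeric inputs below are illustrative placeholders.

K = 8.76e4  # unit-conversion constant giving mm/year


def corrosion_rate(mass_loss_g, area_cm2, time_h, density_g_cm3):
    """Corrosion rate in mm/year from mass loss over one immersion interval."""
    return K * mass_loss_g / (area_cm2 * time_h * density_g_cm3)


# One sample set per intermediate immersion interval:
# (initial mass [g], final mass [g], immersion time [h]) -- hypothetical data
intervals = [
    (1.2000, 1.1952, 168),  # set retrieved after week 1
    (1.1980, 1.1895, 336),  # set retrieved after week 2
]

RHO_MG = 1.74  # assumed density of the Mg alloy, g/cm^3
AREA = 2.5     # assumed exposed surface area, cm^2

for m_initial, m_final, t in intervals:
    rate = corrosion_rate(m_initial - m_final, AREA, t, RHO_MG)
    print(f"{t} h: {rate:.3f} mm/year")
```

Because each interval has its own dedicated sample set, the rates printed per interval can reveal time-dependent behavior that a single end-point measurement would average away.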
The advantages of such a methodology for predicting in vivo degradation rates from in vitro experiments are discussed.