I have some questions about the system core clock, regarding the hardware initial settings and the calculation.
My question arose when I checked the system core clock (CCLK). On our development board we use an ADSP-21479 (266 MHz); the crystal for CLKIN is 25 MHz, and CLK_CFG1-0 is set to "00". According to the ADSP-21479 datasheet, the PLL multiplier ratio should be 8:1 when CLK_CFG1-0 = 00. But when the board is powered on and connected to VisualDSP++ through the USB-ICE emulator, I found that the PLLM field of PMCTL is 16 (0x10) and INDIV is 0. Do both values make sense at power-on?
In this situation: fCCLK = (2 × PLLM × fINPUT) / PLLD = (2 × 16 × 25) / 2 = 400 MHz, so the CCLK period is 2.5 ns and the PCLK period should be 5 ns, which is much faster than the DSP's rated clock (266 MHz). To confirm this, I used the Precision Clock Generator with CLKDIV set to 16 (0x10), so the SCLK period should be 16 × PCLK = 16 × 5 ns = 80 ns. I measured SCLK and its period is indeed 80 ns.
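For reference, here is the arithmetic above as a short script. The formula and register values are the ones quoted in my question; the variable names and the PCLK = CCLK/2 relationship are my own assumptions from the numbers I measured:

```python
# Clock arithmetic from the question (values read back via the emulator).
f_input_mhz = 25.0   # CLKIN crystal frequency
pllm = 16            # PLLM field read from PMCTL (0x10)
plld = 2             # PLL output divider assumed in the formula above
pcg_clkdiv = 16      # CLKDIV programmed into the Precision Clock Generator

f_cclk_mhz = (2 * pllm * f_input_mhz) / plld    # core clock, 400 MHz
cclk_period_ns = 1000.0 / f_cclk_mhz            # 2.5 ns
pclk_period_ns = 2 * cclk_period_ns             # 5 ns (PCLK assumed CCLK/2)
sclk_period_ns = pcg_clkdiv * pclk_period_ns    # 80 ns, matching the scope trace

print(f_cclk_mhz, cclk_period_ns, pclk_period_ns, sclk_period_ns)
```

The computed SCLK period of 80 ns agrees with what I see on the oscilloscope, which is why I believe the PLLM = 16 readback is real and not an emulator artifact.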
Is my calculation above correct? If not, please point out my mistake. If it is correct, why is the clock still much faster than the DSP's rated speed, even though I set CLK_CFG1-0 to its lowest value (00) precisely to get the lowest clock speed at startup?
Attached are the values of PMCTL and PCG_CTLA, and the waveforms of CLKIN (crystal) and SCLK. You can see that PLLM is 16 (even though the hardware setting is 8), that the CLKIN period (pink trace, from the crystal) is 40 ns (25 MHz), and that the SCLK period (yellow trace, from the PCG) is 80 ns (16 × PCLK).
I hope this information helps you answer my questions. Should you need any other information, please let me know.