The article you linked to was about replacing one (misapplied) architectural pattern with another.
The concept of "data flow" is not well defined, but that doesn't matter for this question.
The McCabe formula examines control flow - very informally, you could see it as a measure of the number of decisions a program makes (formally, M = E - N + 2P over the control-flow graph: edges, nodes, connected components). Because it is computed from every individual control statement, it measures at a much finer grain than the architectural changes you propose.
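To see the granularity, here is a minimal sketch, assuming a toy function (`shipping_cost` and its numbers are invented for illustration) and the common per-function counting rule of decisions + 1:

```python
def shipping_cost(weight: float, express: bool, surcharges: list[float]) -> float:
    """Toy function: each control statement adds one decision point."""
    if weight > 20:          # decision 1
        cost = 50.0
    else:
        cost = 20.0
    if express:              # decision 2
        cost *= 2
    for s in surcharges:     # decision 3 (the loop condition)
        cost += s
    return cost

# Cyclomatic complexity = 3 decisions + 1 = 4 for this one function,
# regardless of which architectural pattern the program uses.
print(shipping_cost(25, True, [3.0]))  # 103.0
```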
Let's go with "the McCabe formula doesn't measure data flow".
If you make changes as sweeping as an architectural redesign, then the formula will of course produce a different result - based, as always, on the branching factor of the program.
Suppose you redesign a program so that code - and specifically decisions - is repeated less often. Then the total complexity could well go down. You may also judge this to be a better architecture.
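A contrived before/after sketch of that effect (all names invented): two handlers each repeating the same guard versus one shared guard, which lowers the summed per-function complexity.

```python
# Before: the same decision is duplicated in every handler.
def handle_create(user: str | None) -> str:
    if user is None:                      # decision point, copy 1
        raise ValueError("no user")
    return f"created {user}"

def handle_delete(user: str | None) -> str:
    if user is None:                      # decision point, copy 2
        raise ValueError("no user")
    return f"deleted {user}"

# After: one shared guard. The decision now exists once, so the
# summed cyclomatic complexity across the codebase goes down.
def require_user(user: str | None) -> str:
    if user is None:                      # the only remaining copy
        raise ValueError("no user")
    return user

def handle_create_v2(user: str | None) -> str:
    return f"created {require_user(user)}"

def handle_delete_v2(user: str | None) -> str:
    return f"deleted {require_user(user)}"

print(handle_create_v2("alice"))  # created alice
```

The metric only registers the drop because the duplicated control statements were removed; it says nothing about whether the new structure is architecturally sound.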
But cyclomatic complexity isn't really a way of describing anything as large as an architecture. It is a way of looking at small chunks of code: methods and classes. Using it to evaluate an architecture is fairly meaningless, since any architecture can be implemented well or badly.