No, I was absolutely serious.
It was a blast from the past for me when you posted #85, and I feel REDEFINES deserves a mention.
Not sure you can do the same in VBA?
Oh, very much so! I understand that while GoTos are generally bad, they do have their purpose and should be used sparingly. The rest of the conversation was quite educational, even though a lot of the terms being thrown around required a Google search for me to keep even remotely following along.
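For what it's worth, the one place GoTo is still idiomatic in VBA is error handling, since On Error GoTo is the language's only structured way to trap runtime errors. A minimal sketch (the names SafeDivide and ErrHandler are just illustrative):

```vba
' The classic legitimate GoTo in VBA: a structured error handler.
Sub SafeDivide()
    On Error GoTo ErrHandler
    Debug.Print 1 / 0                ' raises run-time error 11 (division by zero)
    Exit Sub                         ' don't fall through into the handler
ErrHandler:
    Debug.Print "Error " & Err.Number & ": " & Err.Description
End Sub
```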
Well, I don't know whether COBOL is identical to DIBOL/DBL, but the latter had two number formats.
One was a character decimal, where you defined a number as, say, D8.2: an 8-digit number with 2 implied decimal places (dps). The characters were stored as 00345678, but the compiler knew the number was 3456.78. Negatives were shown by a letter at the end, e.g. 0034567t, where t represented -5 (I think), so this number was -3456.75. You didn't have to specify the dps; you could just assert them when you used the number, but you had to be consistent for it to work this way.
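A rough VBA sketch of that decoding, assuming a trailing lowercase letter marks a negative number and that p through x stand in for a final digit of 1 through 9 (so t is 5, matching the -3456.75 example; the real DIBOL letter mapping may well differ):

```vba
' Minimal sketch of decoding a D8.2-style character decimal.
' Assumption: a trailing letter p-x encodes the last digit 1-9 of a
' negative number; actual DIBOL conventions may differ.
Function DecodeD82(ByVal s As String) As Double
    Dim lastCh As String, sign As Double, digits As String
    lastCh = Right$(s, 1)
    If lastCh Like "[a-z]" Then          ' trailing letter = negative
        sign = -1
        digits = Left$(s, Len(s) - 1) & CStr(Asc(lastCh) - Asc("p") + 1)
    Else
        sign = 1
        digits = s
    End If
    ' Two implied decimal places: divide the 8-digit value by 100
    DecodeD82 = sign * CDbl(digits) / 100#
End Function
```

In the Immediate window, ? DecodeD82("0034567t") prints -3456.75 and ? DecodeD82("00345678") prints 3456.78.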
So you could redefine the D8.2 as an A8 and just treat it as 8 characters. You could change the digit characters to produce a different number, but you couldn't store a non-numeric character in the numeric field.
But you could also define a field as an i4, giving you a 4-byte long that could store the normal range of longs. Although you could still redefine the i4 as an A4, that view isn't really meaningful, and it's hard to change the number by changing one character of it.
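To circle back to the earlier question: VBA can approximate REDEFINES by overlaying two user-defined types of the same size with LSet, which copies the raw bytes. A minimal sketch (the type and procedure names are just illustrative); as with the i4-as-A4 case, the byte view of a Long is its binary representation, not readable digits:

```vba
' Sketch of a REDEFINES-style overlay in VBA using LSet between two
' user-defined types of the same size. LSet copies the bytes verbatim.
Private Type LongView
    Value As Long            ' 4 bytes, seen as a number
End Type
Private Type CharView
    Chars(0 To 3) As Byte    ' the same 4 bytes, seen as characters
End Type

Sub RedefineDemo()
    Dim n As LongView, c As CharView
    n.Value = 12345
    LSet c = n               ' overlay: reinterpret the same bytes
    ' Prints 57 48 0 0 (little-endian &H39 &H30 &H0 &H0), not "12345"
    Debug.Print c.Chars(0), c.Chars(1), c.Chars(2), c.Chars(3)
End Sub
```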
Those computers that could support it used any of three (or four) different formats. You could store actual numbers as binary integers or as binary reals (floating point). You could store numbers as ASCII strings (a datatype called "number strings" to differentiate them from "numbers"). And, for machines that could handle it, there were BCD numbers, in which each digit occupied one 4-bit "nibble" (a nibble = 1/2 byte). The nibble values for 0-9 were the actual digits; the other six possible nibble values were specialty flags. One was a minus sign. The others were something else.
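A small VBA sketch of packing digits into BCD nibbles, assuming the IBM-style sign flags (&HC for plus, &HD for minus; other machines assigned the six flag nibbles differently):

```vba
' Minimal sketch of packed-BCD encoding: two decimal digits per byte,
' one per 4-bit nibble, with a final sign nibble.
' Assumption: &HD = minus, &HC = plus, as in IBM packed decimal.
Function PackBCD(ByVal digits As String, ByVal negative As Boolean) As Byte()
    Dim nibbles() As Byte, i As Long, n As Long
    n = Len(digits) + 1                      ' digits plus the sign nibble
    If n Mod 2 = 1 Then digits = "0" & digits: n = n + 1
    ReDim nibbles(0 To n - 1)
    For i = 1 To Len(digits)
        nibbles(i - 1) = CByte(Mid$(digits, i, 1))
    Next i
    nibbles(n - 1) = IIf(negative, &HD, &HC) ' sign flag nibble
    Dim out() As Byte
    ReDim out(0 To n \ 2 - 1)
    For i = 0 To n \ 2 - 1                   ' pack two nibbles per byte
        out(i) = nibbles(2 * i) * 16 + nibbles(2 * i + 1)
    Next i
    PackBCD = out
End Function
```

For example, PackBCD("345", True) yields the bytes &H34 and &H5D: three digits packed two to a byte, with the minus flag in the last nibble.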
IBM's System/360 architecture supported BCD, as did Digital, Motorola (mostly in microchips), Intel, TI, and a few other chip makers.