- The maximum number of local variables in a method is limited to 65,535.
- The maximum number of fields in a class or interface is limited to 65,535.
- The maximum number of methods in a class or interface is limited to 65,535.
- The maximum number of direct superinterfaces of a class or interface is limited to 65,535.
- The maximum number of method parameters is limited to 255.
What's the point of the list above? Sure, those numbers are reduced by various factors (for instance, an instance method's implicit `this` reference consumes one of the 255 parameter slots, so you really get 254). But, that is not the point, my friends. The point is all of the above numbers are HUGE! I was reading my trusty copy of "The Java Virtual Machine Specification Second Edition" when I got to Section 4.10 and read some of the above bullet points. In Java, I live in the land of plenty.
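To make the slot accounting above concrete, here's a small sketch. The rules it encodes are the ones from the spec's limitations section: `long` and `double` parameters occupy two slots each, everything else one, and an instance method's `this` takes one slot off the top of the 255. The `slots` helper is purely illustrative, not a real JVM API.

```java
// Hedged sketch: counting JVM parameter "slots" against the 255 limit.
// Assumption: long and double take two slots, all other types one, and
// an instance method's implicit `this` occupies slot 0.
public class SlotCounter {
    // Returns how many of the 255 available slots a parameter list consumes.
    static int slots(boolean isInstanceMethod, Class<?>... paramTypes) {
        int count = isInstanceMethod ? 1 : 0; // `this` uses one slot
        for (Class<?> t : paramTypes) {
            count += (t == long.class || t == double.class) ? 2 : 1;
        }
        return count;
    }

    public static void main(String[] args) {
        // A static method taking (int, long, double) uses 1 + 2 + 2 = 5 slots.
        System.out.println(slots(false, int.class, long.class, double.class));
        // The same signature on an instance method uses one more.
        System.out.println(slots(true, int.class, long.class, double.class));
    }
}
```

So a method with 128 `double` parameters already blows the limit, even though 128 is far below 255.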
I know it's all two-byte boundaries and all, but what if they had left some of these at one byte or four bits? I can hear it now: "But, we have all the space in the world! Why constrain ourselves in such a way?" True. But, when was the last time you looked at a method with 80 parameters and thought: "DAMN! Now, that's some beautiful code." Exactly. You never have and you never will. The point is not to constrain on a byte boundary, but on a good coding boundary.
Arbitrary numbers are awful to use as metrics (like "you must have fewer than three arguments to each method" and so on). Metrics are useful in relation to something else. You pick what seems reasonable to you (which might not be reasonable for someone else). There is a boundary where any reasonable programmer starts to hurl at bad code, though. Metrics are set below that for each team. I hope I never see the interface with 16,000 superinterfaces. I really do. The cool thing is we have a huge library in Java to look at to see what is reasonable. We have tons of open source projects to feed our reason as well. What if... what if we picked what was reasonable from these code bases by using averages and everything else in our numerical power? Then, we doubled it to make it unreasonable. I wonder what the numbers would look like?
My next experiment is to run some code reflecting over Java, Ruby, and Smalltalk code. I wonder what the averages for each would be (number of method parameters, number of methods per class, number of fields, and number of local variables). I'm curious. I have a guess at what the numbers will be. But, the raw numbers could be telling. I would like to see them for all three of the above languages. It should be interesting. I know there's probably someone out there who has done the same thing, but I want to run the tests on my own. For now, I will do it with the base libraries of all three. My guess is that they will be roughly the same for each of the metrics that are comparable across the languages.
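The Java half of that experiment can be sketched with plain reflection. This is only a toy version of what's described above: the class list here is an arbitrary handful of JDK classes, not the whole base library, and the exact averages will vary by JDK version.

```java
import java.lang.reflect.Method;

// A minimal sketch of the experiment: reflect over a small sample of JDK
// classes and average the number of parameters per declared method.
// The sample is arbitrary; a real run would walk the entire base library.
public class ParamAverage {
    public static void main(String[] args) {
        Class<?>[] sample = {
            String.class, java.util.ArrayList.class, java.util.HashMap.class,
            Thread.class, Integer.class
        };
        long methods = 0, params = 0;
        for (Class<?> c : sample) {
            for (Method m : c.getDeclaredMethods()) {
                methods++;
                params += m.getParameterTypes().length;
            }
        }
        System.out.printf("%d methods, average %.2f parameters%n",
                methods, (double) params / methods);
    }
}
```

Extending this to count fields per class is a one-line change (`getDeclaredFields().length`); local-variable counts would need bytecode inspection rather than reflection.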
What's the point? Reading those limits and imagining the worst cases got me thinking. Our languages constrain us in ways that they shouldn't, but don't in ways that would make our code better. Would it be such a bad thing having to split a method up because you ran over a rather large (let's say 256) number of local variables? I want to find the number where it seems unreasonable and grotesque and make that my limit. I know 65,535 is absurd for the number of local variables, but so is 256. What about 16? My answer is "HELL YES IT IS!", but there are those who would argue that sometimes you need that many. 16 is not hideous, but is still in poor taste.
I'm off to see the wizard. I'll let you know what he says.
What if you are generating the code, and it doesn't matter what it looks like anymore -- humans don't need to see it?
Do those numbers still appear unreasonably high?
Those numbers still seem high even if the computer is generating them. But, if a human is not going to be looking, then I would care less.
I was mainly commenting on how languages have a high set of constraints that they allow.
Certainly I agree that the 65k numbers are high.
I'm not sure about the 255 for method params, though, because one of my professors was telling me about dealing with 500 columns at once - perhaps generating methods in Java to deal with such a monster table isn't out of the question then.
Even with some statistical programs I cannot imagine needing 65k variables... but perhaps one day when we have the computational power to deal with it? I don't know - it's just a thought, however unlikely it may be.