I have seen this “design pattern” (I don’t know what else to call it... template?) show up more than once in Java code. The Application class is expected to be extended, and the extending class is expected to include a main method (I’ve sketched my understanding of that below, after the listing). I can’t figure out what the benefit is of using Class instead of AppFrame as the parameter of the start method, since the result is just type-cast to AppFrame anyway. It all seems so pointless; maybe somebody can fill me in.
import javax.swing.JFrame;
import javax.swing.JPanel;

public class Application {

    public static class AppPanel extends JPanel {
        // stuff
    }

    public static class AppFrame extends JFrame {
        protected AppPanel mainPanel;
        // more stuff
    }

    public static AppFrame start(Class appFrame) {
        try {
            final AppFrame frame = (AppFrame) appFrame.newInstance();
            java.awt.EventQueue.invokeLater(new Runnable() {
                public void run() {
                    frame.setVisible(true);
                }
            });
            return frame;
        } catch (Exception e) {
            e.printStackTrace();
            return null;
        }
    }

    public static void main(String[] args) {
        Application.start(AppFrame.class);
    }
}
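For context, an extending application typically ends up looking something like this (MyApplication and MyFrame are placeholder names I made up, not part of the framework):

public class MyApplication extends Application {

    public static class MyFrame extends AppFrame {
        // application-specific panels and components would go here
    }

    public static void main(String[] args) {
        // the subclass's main just hands its frame class to the framework
        Application.start(MyFrame.class);
    }
}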
I’ve seen that around too, particularly in fairly old examples.
The idea seems to be that everything starts with a single static call and the caller never constructs anything directly. Instead, the “application framework” instantiates what it needs, and the user only specifies which concrete implementations to use. So the caller names the class to use, but nothing is actually instantiated until the framework decides to create it. One risk of this approach is that you can pass a class whose instances are not convertible to AppFrame; because the parameter is a raw Class, that mistake is not caught by the compiler and only surfaces as a runtime exception.
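For example, this hypothetical call (not from the code above) compiles without complaint but blows up inside start():

// Compiles because the parameter is a raw Class, but a JLabel is not an
// AppFrame, so the cast inside start() throws a ClassCastException at
// runtime (caught there, printed, and null is returned).
Application.start(javax.swing.JLabel.class);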
I personally consider this ugly design. It makes sense in languages where instantiating a class through a class object is idiomatic, such as Smalltalk or Python. In Java, I think sticking with factory interfaces is clunkier but more in keeping with the language's object orientation and type system.
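A minimal sketch of what I mean by the factory-interface approach (FactoryApplication and AppFrameFactory are made-up names; on Java 8+ you could use java.util.function.Supplier&lt;AppFrame&gt; instead):

import javax.swing.JFrame;

public class FactoryApplication {

    // Hypothetical factory interface: the caller supplies the construction logic.
    public interface AppFrameFactory {
        AppFrame create();
    }

    public static class AppFrame extends JFrame {
        // stuff
    }

    public static AppFrame start(AppFrameFactory factory) {
        // No reflection and no cast: the compiler guarantees an AppFrame.
        final AppFrame frame = factory.create();
        java.awt.EventQueue.invokeLater(new Runnable() {
            public void run() {
                frame.setVisible(true);
            }
        });
        return frame;
    }

    public static void main(String[] args) {
        FactoryApplication.start(new AppFrameFactory() {
            public AppFrame create() {
                return new AppFrame();
            }
        });
    }
}

The compiler now rejects any factory that does not produce an AppFrame, which removes the runtime cast and the raw Class parameter entirely.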