We already mentioned some of the main visible components that serve as the building blocks for GUIs in typical GUI libraries, such as windows and buttons. The image below shows a few more that we commonly encounter in today’s software applications, including group boxes, labels, check boxes, radio buttons, combo boxes, line input fields, text input areas, tab views, and list views. Others that you are probably familiar with are toolbars, tool buttons, menu bars, context menus, and status bars, and there are many, many more!
Typically, the GUI library contains a class for each of these visible elements, which are commonly referred to as widgets. Certain widgets can serve as containers for other widgets, so the widgets of a concrete graphical interface tend to be organized hierarchically. For instance, a dialog box widget can contain many other widgets, including a tab widget that in turn contains labels and buttons on each of its tab pages. If a widget A directly contains another widget B, we say that B is a child of A and A is B’s parent. A widget without a parent is a window that is displayed independently on the screen. Widgets can have many different attributes controlling their visual appearance, their layout behavior, and how they operate. Methods defined in the respective widget class allow for accessing and modifying these attributes. The most common operations performed with widgets in program code are:
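The parent–child organization described above can be sketched with a few plain Python classes. This is a conceptual model only, not the API of any real GUI library; the class name `Widget`, the `attributes` dictionary, and the method names are illustrative assumptions.

```python
# Minimal sketch of a widget hierarchy using plain Python classes.
# All names here are illustrative, not taken from a real GUI library.

class Widget:
    def __init__(self, parent=None, **attributes):
        self.parent = parent
        self.children = []
        self.attributes = attributes  # e.g. appearance and layout settings
        if parent is not None:
            parent.children.append(self)  # register as a child of the parent

    def is_window(self):
        # A widget without a parent is a top-level window.
        return self.parent is None

    def set_attribute(self, name, value):
        # Accessor methods allow modifying a widget's attributes.
        self.attributes[name] = value


# Build a small hierarchy: a dialog containing a tab widget,
# which in turn contains a label and a button.
dialog = Widget(title="Settings")            # no parent -> a window
tabs = Widget(parent=dialog)
label = Widget(parent=tabs, text="Volume")
button = Widget(parent=tabs, text="OK")

print(dialog.is_window())     # True
print(button.parent is tabs)  # True
print(len(tabs.children))     # 2
```

Real toolkits follow the same pattern: constructing a widget with a parent argument inserts it into the hierarchy, and destroying a parent typically destroys its children as well.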
We will explain the ideas of layout management and event handling hinted at in the last two bullet points above in more detail in the next sections. From the user's perspective, widgets can be interacted with in many different ways depending on the type of widget, including the following very common forms of interaction:
In addition, there are complex widgets that allow the user to interact with them and change their state by clicking on particular parts of the widget. Examples are the user unfolding a combo box to select a different value, clicking on a menu in the menu bar to open that menu and select one of its items, moving the slider component of a widget to adjust some value, or selecting a color by clicking on a particular location in a widget showing a chromatic circle. The events caused by such user interactions are what drives the order of code execution in the underlying program, as will be further explained in Section 2.4.3 [1].
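The event-driven execution mentioned above can be illustrated with a minimal sketch: user interactions are turned into events, and handler functions registered for an event are run when it occurs. The event names, the `connect`/`post` functions, and the queue-draining loop below are illustrative assumptions; a real toolkit's event loop blocks waiting for input from the operating system.

```python
# Hedged sketch of event-driven execution. The handler registry, event
# names, and loop are illustrative, not the API of a real GUI library.

from collections import deque

handlers = {}          # event name -> list of handler functions
event_queue = deque()  # events waiting to be processed

def connect(event_name, handler):
    """Register a handler to be called when the named event occurs."""
    handlers.setdefault(event_name, []).append(handler)

def post(event_name, data=None):
    """Enqueue an event, as the toolkit would on a user interaction."""
    event_queue.append((event_name, data))

def run_event_loop():
    # A real event loop waits for OS input; here we just drain the queue.
    while event_queue:
        name, data = event_queue.popleft()
        for handler in handlers.get(name, []):
            handler(data)

selected = []
connect("combo_box_selected", lambda value: selected.append(value))

# Simulate the user unfolding a combo box and picking a value.
post("combo_box_selected", "blue")
run_event_loop()
print(selected)  # ['blue']
```

Note that the order of execution is determined by the order in which events arrive, not by the textual order of the program: the handler runs only when, and as often as, its event is posted.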