
12 posts tagged with "CS"


· 3 min read

Motivation

During my work as a software engineer, I've discovered that certain computer science modules have remained invaluable. These modules have played a significant role in both grasping new concepts and tackling everyday problems at work. Here are a few that frequently come to mind.

CS2030S Programming Methodology II

While Java is not as low-level as C or C++, this module taught me the fundamentals of Java programming. Understanding a statically typed language has been instrumental in grasping other languages, as concepts like types, classes, and objects are universal. Specifically, topics such as Generics, Streams (Functional Programming), and concurrency have proven to be broadly applicable across different programming languages.

CS2103T Software Engineering

This module offered a comprehensive overview of software engineering practices, instilling numerous best practices that I still strive to adhere to in my professional life. The skills it taught in diagramming, testing, and team collaboration (including writing effective commit messages) have been practical, and the standards it set sometimes exceed what is commonly practiced in the industry.

CS2102 Database Systems

Although I've largely forgotten the specifics of normal forms such as BCNF and the normalization process, encountering SQL and databases is inevitable as a software engineer. The knowledge gained from this module proves useful during interviews and instills confidence in schema design. Understanding the CAP theorem is also beneficial when designing distributed systems. Additionally, insights from the NTU module CE/CZ4153 Blockchain Technology are pertinent in this context, especially regarding balancing consistency, availability, and partition tolerance in systems.

CS3282 Thematic Systems Project II

This module essentially equates to managing a small open-source project. The tasks of reviewing PRs and gaining a deep understanding of a complex codebase are invaluable. It emphasizes taking ownership of code and its continuous evolution. This includes navigating through issues, PRs, and discussions to grasp the codebase's context and handling code changes by others. I have gained a profound appreciation for maintenance burdens and code design, recognizing the long-term impacts of design decisions and becoming more cautious with feature introductions and refactorings.

CS4218 Software Testing

Especially relevant for QA roles, this module's focus on unit/integration testing and developing a robust test suite remains pertinent. It covers managing the combinatorial explosion of possible inputs with a well-chosen subset of test cases, a critical skill for software engineers. Additionally, the project that involved implementing common command-line tools provided a solid foundation in Unix shell basics, including piping and using tools like cat and grep.

Conclusion

While there may be more modules worth mentioning, these are the ones that stand out to me. Reflecting on my academic journey, I realize that even the modules that seemed less immediately useful at the time can become highly relevant later on. Whether it's technical theories or database normalization strategies, such knowledge becomes a valuable resource when facing challenges or making informed decisions in the field.

· 9 min read

This post is inspired by the content from Brandon Rhodes on The Composition Over Inheritance Principle, which I found particularly intriguing. My aim in this post is to review and rewrite the code examples from my perspective.

Motivation

The debate between inheritance and composition is a longstanding one. While both approaches offer ways to create useful abstractions, they come with their unique challenges and benefits. The distinction, however, sometimes introduces complexities rather than simplifying our design.

A prime example of this complexity is what I refer to as the "2 x 2 problem." It arises with inheritance when a class has two (or more) independent dimensions of behavior, each of which can be extended in multiple ways, so the number of subclasses needed grows multiplicatively.

Consider the case of a logging system, as illustrated in the referenced article. A logger typically needs to manage two key aspects: filtering and output destination. But what happens when we require multiple filtering methods and various output destinations? The conventional inheritance-based implementation of a Logger class, as we will see, may not be the most maintainable solution:

class Logger:
    def __init__(self):
        self.content = []

    def log(self, message):
        self.content.append(message)
        return message


# Filtering
class LevelFilteredLogger(Logger):
    def __init__(self, level):
        self.level = level
        super().__init__()

    def log(self, message):
        if self.level in message:
            return super().log(message)


class LengthFilteredLogger(Logger):
    def __init__(self, length_limit):
        self.length_limit = length_limit
        super().__init__()

    def log(self, message):
        if len(message) < self.length_limit:
            return super().log(message)


# Log Destination
class STDOutLogger(Logger):
    def log(self, message):
        print(message)
        return super().log(message)


class FileLogger(Logger):
    def __init__(self, filename):
        self.filename = filename
        super().__init__()

    def log(self, message):
        with open(self.filename, 'a') as f:
            f.write(message + '\n')
        return super().log(message)

The challenge arises when attempting to create combinations from these two dimensions. A LevelFilteredFileLogger example demonstrates this complexity:

# Combination of Filtering and Log Destination
class LevelFilteredFileLogger(LevelFilteredLogger):
    def __init__(self, level, filename):
        self.filename = filename
        super().__init__(level)

    def log(self, message):
        msg = super().log(message)
        if msg:
            with open(self.filename, 'a') as f:
                f.write(msg + '\n')
        return msg


if __name__ == '__main__':
    logger = LevelFilteredFileLogger('ERROR', 'log.txt')
    logger.log('DEBUG: debug message')
    logger.log('INFO: info message')
    logger.log('ERROR: error message')

# Expected output in log.txt:
# ERROR: error message

Creating more combinations quickly becomes cumbersome without resorting to multiple inheritance, and even with multiple inheritance each combined subclass must know how its parents' constructors and log methods interact, so the resulting behavior can easily drift from what was intended.
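
To make the cost concrete, here is a rough sketch of the multiple-inheritance shortcut (the combined class names are my own, not from the referenced article). The method resolution order happens to chain the log() calls correctly, but the constructors stop composing as soon as a parent needs its own arguments, so each combination still requires hand-written glue:

class LevelFilteredSTDOutLogger(LevelFilteredLogger, STDOutLogger):
    # Works only by virtue of the MRO:
    # LevelFilteredLogger.log -> STDOutLogger.log -> Logger.log
    pass


class LengthFilteredFileLogger(LengthFilteredLogger, FileLogger):
    def __init__(self, length_limit, filename):
        # LengthFilteredLogger.__init__ calls super().__init__() with no arguments,
        # which would reach FileLogger.__init__(filename) and fail, so the attributes
        # are wired up by hand and the cooperative chain is skipped.
        self.length_limit = length_limit
        self.filename = filename
        Logger.__init__(self)

Every additional filter or destination multiplies the number of such classes, which is exactly the 2 x 2 problem described above.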

A solution to this problem will be discussed later on. But first, let us explore a related issue and a possible approach through the adapter pattern.

Adapter Pattern

The adapter pattern proves particularly useful in scenarios where the existing abstraction cannot be modified, often encountered in legacy systems. Here, the original developer may have defined specific "filtering" and "output" functionalities. Subsequently, a requirement emerges to introduce a new functionality, necessitating a combination of this new feature with the existing ones. To illustrate this challenge more effectively, consider the following adjusted code example:

class FileHandler:
    def __init__(self, filename):
        self.filename = filename

    def write(self, message):
        with open(self.filename, 'a') as f:
            f.write(message + '\n')


class FileLogger:
    def __init__(self, file_handler):
        self.file_handler = file_handler

    def log(self, message):
        self.file_handler.write(message)


class LevelFilteredLogger(FileLogger):
    def __init__(self, file_handler, level):
        self.level = level
        super().__init__(file_handler)

    def log(self, message):
        if self.level in message:
            super().log(message)

This scenario depicts a development path where initially, the focus was on supporting file-based logging, followed by the introduction of filtering capabilities. This approach might seem adequate initially. However, the complexity resurfaces with the need to incorporate another filtering method.

To address this using the adapter pattern, the strategy involves:

  • First, create a STDOutHandler dedicated to writing log messages to standard output.
  • Next, create a STDOutAdapter that bridges the STDOutHandler with the FileHandler interface expected by the FileLogger. This is crucial because, at this point, the FileLogger cannot work with a STDOutHandler directly, and modifying the FileLogger is undesirable.
  • The adapter effectively lets any alternative that conforms to the required interface stand in for the FileHandler.
  • With the STDOutAdapter in place, it can be plugged into the LevelFilteredLogger.
    • Here, the STDOutAdapter complies with the FileHandler interface while encapsulating a STDOutHandler, delegating the write call to it.

The code illustrating the implementation of these additional classes is as follows:

class STDOutHandler:
    def log(self, message):
        print(message)


class STDOutAdapter(FileHandler):
    def __init__(self):
        self.std_out_handler = STDOutHandler()
        super().__init__(None)  # no filename is needed, but FileHandler's constructor expects one

    def write(self, message):
        # Delegate writing to the wrapped STDOutHandler instead of a file
        self.std_out_handler.log(message)


class LengthFilteredLogger(FileLogger):
    def __init__(self, file_handler, length_limit):
        self.length_limit = length_limit
        super().__init__(file_handler)

    def log(self, message):
        if len(message) < self.length_limit:
            super().log(message)


if __name__ == '__main__':
    file_handler = FileHandler('log.txt')
    file_logger = FileLogger(file_handler)
    file_logger.log('Hello World')

    std_out_handler = STDOutAdapter()
    std_out_logger = FileLogger(std_out_handler)
    std_out_logger.log('Hello World')

    level_filtered_std_logger = LevelFilteredLogger(std_out_handler, 'INFO')
    level_filtered_std_logger.log('INFO: Hello World')
    level_filtered_std_logger.log('DEBUG: Hello World')

    length_filtered_std_logger = LengthFilteredLogger(std_out_handler, 10)
    length_filtered_std_logger.log('Hello World')
    length_filtered_std_logger.log('Hello')

# Expected output in log.txt:
# Hello World

# Expected output in console:
# Hello World
# INFO: Hello World
# Hello

This approach presents a viable solution when modifications to the existing codebase are not an option. By leveraging the adapter pattern, we've effectively minimized the necessity for multiple classes, enabling the straightforward replacement of one handler with another.

Bridge Pattern

Building on the previous discussion, the bridge pattern also emerges as an effective strategy to address the outlined challenges. This pattern comes into play when there are no constraints against modifying the original source code. Unlike the adapter pattern, which necessitates conforming to an existing, possibly ill-suited interface due to unanticipated extensions, the bridge pattern advocates for establishing necessary abstractions upfront.

In contrast to the somewhat cumbersome adaptation required to fit the FileHandler interface in the adapter pattern scenario, the bridge pattern would have us define a Handler class at the outset. This foundational class would then serve as a base for both STDOutHandler and FileHandler, each implementing the Handler class's interface methods. This design allows for the straightforward addition of new handler types, ensuring the system's extensibility and maintainability.

import abc

class Handler(abc.ABC):
    @abc.abstractmethod
    def write(self, message):
        pass


class STDOutHandler(Handler):
    def write(self, message):
        print(message)


class FileHandler(Handler):
    def __init__(self, filename):
        self.filename = filename

    def write(self, message):
        with open(self.filename, 'a') as f:
            f.write(message + '\n')

With the bridge pattern's application, the Logger is designed to accommodate any Handler, thereby significantly simplifying the process of extending logging destinations. The structure of the Logger class, adhering to this pattern, would be as follows:

class Logger:
    def __init__(self, handler):
        self.handler = handler

    def log(self, message):
        self.handler.write(message)

The LevelFilteredLogger and LengthFilteredLogger would look like this:

class LevelFilteredLogger(Logger):
    def __init__(self, handler, level):
        self.level = level
        super().__init__(handler)

    def log(self, message):
        if self.level in message:
            super().log(message)


class LengthFilteredLogger(Logger):
    def __init__(self, handler, length_limit):
        self.length_limit = length_limit
        super().__init__(handler)

    def log(self, message):
        if len(message) < self.length_limit:
            super().log(message)

if __name__ == '__main__':
    file_handler = FileHandler('log.txt')
    file_logger = Logger(file_handler)
    file_logger.log('Hello World')

    std_out_handler = STDOutHandler()
    std_out_logger = Logger(std_out_handler)
    std_out_logger.log('Hello World')

    level_filtered_std_logger = LevelFilteredLogger(std_out_handler, 'INFO')
    level_filtered_std_logger.log('INFO: Hello World')
    level_filtered_std_logger.log('DEBUG: Hello World')

    length_filtered_file_logger = LengthFilteredLogger(file_handler, 10)
    length_filtered_file_logger.log('Hello World')
    length_filtered_file_logger.log('Hello')

# Expected output in log.txt:
# Hello World
# Hello

# Expected output in console:
# Hello World
# INFO: Hello World

Decorator Pattern

The decorator pattern also offers a unique perspective on addressing this issue. It encourages the definition of classes that maintain identical signatures, enabling their encapsulation within one another to facilitate stackable behavior. In the context of loggers, this approach begins with the establishment of destination-based logging classes:

class STDOutLogger:
    def log(self, message):
        print(message)


class FileLogger:
    def __init__(self, filename):
        self.filename = filename

    def log(self, message):
        with open(self.filename, 'a') as f:
            f.write(message + '\n')

Then, we define the filtering-based logging classes to contain destination-based loggers:

class LevelFilteredLogger:
    def __init__(self, logger, level):
        self.logger = logger
        self.level = level

    def log(self, message):
        if self.level in message:
            self.logger.log(message)


class LengthFilteredLogger:
    def __init__(self, logger, length_limit):
        self.logger = logger
        self.length_limit = length_limit

    def log(self, message):
        if len(message) < self.length_limit:
            self.logger.log(message)

The strength of the decorator pattern lies in its flexibility to combine functionalities such as level filtering and length filtering. This can be achieved by wrapping one filter around another, for example instantiating a LengthFilteredLogger that encapsulates a LevelFilteredLogger, thus avoiding the need to define a new class for the combined behavior.

if __name__ == '__main__':
    file_logger = FileLogger('log.txt')
    file_logger.log('Hello World')

    std_out_logger = STDOutLogger()
    std_out_logger.log('Hello World')

    level_filtered_std_logger = LevelFilteredLogger(std_out_logger, 'INFO')
    level_filtered_std_logger.log('INFO: Hello World')
    level_filtered_std_logger.log('DEBUG: Hello World')

    length_filtered_file_logger = LengthFilteredLogger(file_logger, 10)
    length_filtered_file_logger.log('Hello World')
    length_filtered_file_logger.log('Hello')

    length_and_level_filtered_logger = LengthFilteredLogger(level_filtered_std_logger, 10)
    length_and_level_filtered_logger.log('INFO: Hello World Again')
    length_and_level_filtered_logger.log('DEBUG: Hello World')
    length_and_level_filtered_logger.log('INFO: HI')

# Expected output in log.txt:
# Hello World
# Hello

# Expected output in console:
# Hello World
# INFO: Hello World
# INFO: HI

A Combination of Patterns

To synthesize the various strategies discussed, a comprehensive solution might be structured as follows:

import abc

class Filter(abc.ABC):
    @abc.abstractmethod
    def should_keep(self, message):
        pass


class Handler(abc.ABC):
    @abc.abstractmethod
    def write(self, message):
        pass


class Logger:
    def __init__(self, filters: list[Filter], handlers: list[Handler]):
        self.filters = filters
        self.handlers = handlers

    def log(self, message):
        if all(f.should_keep(message) for f in self.filters):
            for h in self.handlers:
                h.write(message)


class LevelFilter(Filter):
    def __init__(self, level):
        self.level = level

    def should_keep(self, message):
        return self.level in message


class LengthFilter(Filter):
    def __init__(self, length_limit):
        self.length_limit = length_limit

    def should_keep(self, message):
        return len(message) < self.length_limit


class FileHandler(Handler):
    def __init__(self, filename):
        self.filename = filename

    def write(self, message):
        with open(self.filename, 'a') as f:
            f.write(message + '\n')


class STDOutHandler(Handler):
    def write(self, message):
        print(message)


if __name__ == '__main__':
    file_handler = FileHandler('log.txt')
    length_file_logger = Logger([LengthFilter(10)], [file_handler])
    length_file_logger.log('Hello World')

    std_out_handler = STDOutHandler()
    level_std_out_logger = Logger([LevelFilter('INFO')], [std_out_handler])
    level_std_out_logger.log('INFO: Hello World')

    length_and_level_logger = Logger([LengthFilter(10), LevelFilter('INFO')], [file_handler, std_out_handler])
    length_and_level_logger.log('INFO: Hello World')

# Expected output in console:
# INFO: Hello World

Conclusion

The utility of design patterns becomes significantly clearer when contextualized with specific problems and examples. Each pattern offers a distinct approach to resolution, with certain strategies proving more effective under specific circumstances, particularly in relation to constraints like the feasibility of altering the original source code or the initial design. In summary, several key principles underlie these patterns:

  • Composition and aggregation provide a broader range of possibilities than inheritance does.
  • The establishment of abstractions (interfaces) facilitates easier extension of the codebase.
  • While indirection can offer valuable flexibility, it also has the potential to introduce complexity and confusion.

References

  • Brandon Rhodes, The Composition Over Inheritance Principle

· 3 min read

Motivation

This article reflects on the blog posts, The Prebound Method Pattern and The Sentinel Object Pattern, by Brandon Rhodes. I'll briefly summarize the patterns and discuss my thoughts on them.

The Prebound Method Pattern

This pattern can be observed in standard-library modules such as random and logging. Instead of creating an instance of a class ourselves, we can call a module-level function directly. This is possible because a default instance is created within the module, and its bound methods are assigned to the module's global namespace.

For example, a logger could be created as follows:

class Logger:
    def __init__(self, name):
        self.name = name

    def log(self, message):
        print(f"{self.name}: {message}")


_default_logger = Logger("default")
log = _default_logger.log

The log method is assigned to the module's global namespace, allowing it to be called directly. This is a simple example, but it's useful when you want to create and use a default instance of a class without needing to create a new one.

This supports using the logger as follows (assuming the code above lives in a module named logger):

import logger

logger.log("Hello World")

This pattern isn't super common, but it's a neat trick to know about. It feels a bit like the singleton pattern. However, since we don't restrict the number of instances that can be created, it's not truly a singleton.
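
For reference, this is roughly how the random module applies the idea; the snippet below is a paraphrased sketch, not the actual CPython source:

import random as _stdlib_random  # borrow the Random class just for this sketch

# A hidden, prebound default instance created at import time...
_inst = _stdlib_random.Random()

# ...whose bound methods are re-exported as module-level "functions".
random = _inst.random
randint = _inst.randint
seed = _inst.seed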

The Sentinel Object Pattern

This pattern highlights that, despite Python's support for None, we can sometimes provide a more meaningful value to represent a missing value. This is particularly useful when we need to differentiate between a missing value and a valid one.

To illustrate with a similar example from the original article, consider the context of open-source software, where we might want to specify the type of license. We could use None to represent an unspecified license type. However, a valid alternative could be to assign it to a License object that clearly indicates the type of license (e.g., "not specified" or "unlicensed", which may mean different things).

Here's an example of the pattern in action:

class License:
    def __init__(self, name):
        self.name = name

    def get_name(self):
        return self.name


class Package:
    def __init__(self, name, license):
        self.name = name
        self.license = license


# INSTEAD OF
packages = [
    Package("dummy1", None),
    Package("dummy2", License("BSD")),
]

for package in packages:
    if package.license is None:
        print("not specified")
    else:
        print(package.license.get_name())

# WE CAN USE
UNLICENSED = License("unlicensed")
NOT_SPECIFIED = License("not specified")

packages = [
    Package("dummy1", UNLICENSED),
    Package("dummy2", NOT_SPECIFIED),
    Package("dummy3", License("BSD")),
]

for package in packages:
    print(package.license.get_name())

The advantage here is replacing None with more explicit values, which documents the intent more clearly. This may also reduce the need for None checks in the code.
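
Another common shape of the sentinel object pattern, shown here as a small sketch of my own (the get_setting helper is hypothetical), uses a bare object() to distinguish "no argument passed" from an explicit None:

_MISSING = object()  # unique sentinel; it is identical only to itself

def get_setting(settings, key, default=_MISSING):
    # A stored None counts as a real value here, which default=None
    # could not distinguish from "key absent".
    value = settings.get(key, _MISSING)
    if value is _MISSING:
        if default is _MISSING:
            raise KeyError(key)
        return default
    return value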

Conclusion

The two patterns are subtle but can be quite useful in certain cases. It's good to be aware of them and use them when appropriate. The original articles are also worth reading for more detailed explanations.