Dispatch Today News

The Case for Light Digitalization

By Joey Sarte Salceda

Chair, Institute for Risk and Strategic Studies, Inc.

Many proposals for government digitalization today are ambitious, sophisticated, and well-intentioned. They typically envision integrated systems, centralized platforms, and end-to-end automation across agencies and functions. In principle, these objectives are desirable. In practice, however, they assume a level of institutional readiness, process clarity, and operational discipline that most government offices do not yet possess.

This observation is not a critique of technological ambition. It is a recognition of institutional starting points. Many public offices operate with uneven capacity, legacy procedures, and informal work-arounds that have accumulated over time. These arrangements are often imperfect, but they reflect adaptations to constraints rather than deliberate design. When such arrangements are digitized without prior simplification, the result is rarely improvement.

Heavy digitalization requires that processes be well understood, internally consistent, and stable. Rules must be coherent before they can be encoded. Discretion must be clearly defined before it can be constrained. Absent these conditions, digitalization does not reform institutional behavior. It preserves existing weaknesses and amplifies them by embedding them into systems that are harder to adjust once deployed.

For this reason, the Albay Institute for Artificial Intelligence has focused on a different approach, which may be described as light but diffused digitalization. The objective is not to redesign entire systems, but to improve specific tasks where the source of difficulty is well understood and the scope of intervention can be tightly controlled.

Light, but Diffused

Light digitalization concentrates on discrete points of friction rather than comprehensive workflow redesign. Instead of constructing platforms, it produces tools. Instead of standardizing entire processes, it relieves particular burdens that consume time, attention, or precision. The scope of change is intentionally limited, and the reversibility of the intervention is preserved.

One example is the preparation of codal amendments to complex statutes such as the National Internal Revenue Code. This task is governed by strict formatting conventions, and minor errors can materially affect interpretation, review, or publication. The work is repetitive, cognitively demanding, and prone to error, yet it does not benefit from creativity or discretion once the rules are known.

The Institute developed a simple HTML-based tool that operates offline, requires no accounts, and introduces no cybersecurity risk. The tool formats codal amendments correctly and produces a summary table of amendments and deletions. It does not alter existing legislative procedures, approval chains, or institutional roles. It merely removes a persistent source of technical error and delay.
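
To give a sense of the weight of such a tool, the sketch below is a hypothetical single-file page in the same spirit. It is not the Institute's code, and it assumes one common drafting convention (new matter rendered in capitals, deleted matter retained in brackets); the actual formatting rules are more detailed. Everything happens in the user's browser, offline, with no accounts and no network connection.

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Codal amendment formatter (illustrative sketch only)</title>
</head>
<body>
  <!-- Assumed inputs: section number, current codal text, proposed text. -->
  <input id="section" placeholder="Section No.">
  <textarea id="current" placeholder="Current codal text"></textarea>
  <textarea id="proposed" placeholder="Proposed text (leave blank for a deletion)"></textarea>
  <button onclick="formatAmendment()">Format</button>

  <h3>Formatted output</h3>
  <pre id="output"></pre>

  <h3>Summary of amendments and deletions</h3>
  <table border="1" id="summary">
    <tr><th>Section</th><th>Nature of change</th></tr>
  </table>

  <script>
    // Hypothetical convention: new matter in capitals, deleted matter kept in brackets.
    function formatAmendment() {
      const sec = document.getElementById('section').value.trim();
      const cur = document.getElementById('current').value.trim();
      const pro = document.getElementById('proposed').value.trim();

      const isDeletion = pro === '';             // a blank proposal is treated as a deletion
      const body = isDeletion
        ? '[' + cur + ']'                        // bracket the text being struck out
        : pro.toUpperCase();                     // capitalize new matter

      document.getElementById('output').textContent = 'SEC. ' + sec + '. ' + body;

      // Append one row to the running summary table.
      const row = document.getElementById('summary').insertRow();
      row.insertCell().textContent = 'Sec. ' + sec;
      row.insertCell().textContent = isDeletion ? 'Deletion' : 'Amendment';
    }
  </script>
</body>
</html>

Nothing in a page of this kind touches a server or a database, which is what keeps the intervention light and fully reversible.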

By automating formatting and documentation in this limited context, the tool improves the clarity and speed with which bills are reviewed and compared. This reduces the cost of scrutiny and lowers barriers to understanding, thereby supporting more informed deliberation without imposing systemic change.

Why Heavy Digitalization Often Fails

Heavy digitalization efforts often falter because they assume that institutional knowledge can be centralized and fully specified in advance. As followers of Hayek have long argued, much of what enables institutions to function is tacit, dispersed, and embedded in local practice rather than in formal rules.

Large digital systems are typically designed at a distance from day-to-day operations. As a result, they encode an incomplete understanding of how work is actually performed. When these systems encounter variation, exceptions, or informal practices, they respond with rigidity rather than adaptation.

In such cases, errors cease to be local and correctable. They become systemic. What was once a manageable work-around becomes a binding constraint. The cost of adjustment rises, and the institution becomes less capable of responding to change.

Process and Method

The Institute’s work begins with problem identification rather than technology selection. Workshops are conducted with offices and with rank-and-file staff to identify where tasks impose disproportionate costs in time, effort, or accuracy. Existing tools and actual staff capabilities are inventoried, and solutions are designed within these constraints.

Where appropriate, tools are developed collaboratively using AI-assisted coding. Staff articulate the logic of the task in plain English, while the AI translates this logic into executable code. Syntax is treated as an implementation detail rather than a prerequisite skill.
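
A hypothetical fragment of that exchange might look like the following, with the staff member's plain-English rule preserved as a comment above the code produced from it (the rule and the function name here are illustrative, not taken from an actual workshop):

<script>
  // Staff rule, stated in plain English during the workshop (illustrative):
  // "Go through the list of amended sections in order and warn me
  //  if the same section number appears twice."
  function findDuplicateSections(sectionNumbers) {
    const seen = new Set();
    const duplicates = [];
    for (const num of sectionNumbers) {
      if (seen.has(num)) duplicates.push(num);
      seen.add(num);
    }
    return duplicates;   // e.g. ['Sec. 27'] if Sec. 27 was amended twice
  }
</script>

The point of the exercise is not the code itself but that the person who stated the rule can read the result back, line by line, and confirm that it says what they meant.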

In many cases, a functional tool exists by the end of the workshop. In others, the prototype serves as a scaffold for later refinement. In both cases, the logic of the tool remains transparent to its users, and ownership of the solution remains within the office.

Democratizing Digitalization

Recent advances in artificial intelligence have altered the production function of software for small-scale applications. English has become a usable interface for specifying logic and behavior in limited contexts. This lowers the threshold for participation in tool creation without eliminating the need for judgment.

As a result, digital capability need not be centralized in specialized units or external vendors. Agencies can participate directly in the creation of tools that address their own operational needs. This distributes digital capacity in a manner consistent with dispersed institutional knowledge.

Light digitalization therefore proceeds incrementally, respects existing constraints, and preserves adaptability. In environments where institutional capacity varies widely, this approach offers a realistic path to improvement without the risks associated with comprehensive system overhaul. 
