
Generating auxiliary files: re-export, export object, validators from models – can we be friends with Webpack?

by admin

When developing a SPA, quite a bit of time is spent on importing and exporting various files, as well as on creating validation schemas. These tasks are easy enough to automate, but, as usual, "there are nuances" – let's try to sort them out.

I will base this project on the code from the previous article, since I am writing these texts as a more or less connected series.

About re-export and main component files

ES6 import syntax lets you do it like this (named imports are, of course, preferable):

import { ComponentOne } from 'someFolder/ComponentOne/ComponentOne.tsx';
import { ComponentTwo } from 'someFolder/ComponentTwo/ComponentTwo.tsx';

This code has two rather annoying drawbacks: the need to write the path all the way down to the file (which produces semantic duplication and intrudes into the internal implementation, namely the structure and naming of files within the component), and the need to write two separate imports, which bloats the file. The first is solved by pointing to the component's main file, with two dominant approaches:

  • create an index.ts file in each component folder with the content export * from './ComponentOne.tsx';. Disadvantages: the file goes through all stages of compilation, increasing build time; it is included in the bundle, increasing its size; and it creates additional load on the compiler (TypeScript in this case). Also, a quick jump in the IDE usually lands on this index file, so you have to "fall through" further, and the list of open files accumulates identically named index.ts files that are useless for development. Sometimes re-exports of files other than the main one are added to the same file, but that only causes confusion with another file type, which I will talk about next.
  • create a package.json file in each component folder with the contents { "main": "ComponentOne.tsx", "types": "ComponentOne.tsx" }. This is an explicitly technical file that never actually needs to be opened and does not interfere with development, while still letting you keep semantic file names. The only drawback is that Webpack in watch mode cannot pick this file up "on the fly" when it is added to a folder, requiring a manual restart of the build.
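Keeping these package.json stubs in sync by hand is tedious, so they too can be generated. A minimal sketch of such a generator – the writeComponentManifest helper and the "main file is named after the folder" convention are assumptions for illustration:

```typescript
import fs from 'fs';
import os from 'os';
import path from 'path';

// Write a package.json stub whose "main" and "types" point at the
// component's main file, assumed to be named after the folder,
// e.g. ComponentOne/ComponentOne.tsx.
export function writeComponentManifest(componentFolder: string): string {
  const componentName = path.basename(componentFolder);
  const mainFile = `${componentName}.tsx`; // assumed naming convention
  const manifest = { main: mainFile, types: mainFile };
  const manifestPath = path.join(componentFolder, 'package.json');
  fs.writeFileSync(manifestPath, JSON.stringify(manifest, null, 2) + '\n', 'utf8');
  return manifestPath;
}

// Demo against a temporary directory
const root = fs.mkdtempSync(path.join(os.tmpdir(), 'gen-'));
const folder = path.join(root, 'ComponentOne');
fs.mkdirSync(folder);
const written = writeComponentManifest(folder);
console.log(fs.readFileSync(written, 'utf8'));
```

Running such a script once per component folder (or wiring it into the generator described below) removes the manual-sync burden entirely.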

Whichever way you choose, the imports are reduced to the following:

import { ComponentOne } from 'someFolder/ComponentOne';
import { ComponentTwo } from 'someFolder/ComponentTwo';

The next step is to create a re-export file. In someFolder you likewise create the folder's main file (which this time is a re-export file) and a file pointing to it according to the scheme above. For this type of file I prefer names of the form _someFolder.ts: the underscore keeps it at the top of the file list and semantically separates it from other files (such as several dozen utility files in a utils folder) or folders, indicating its special purpose and that it should not be touched by hand (JavaScript developers have long had the habit of naming "quasi-private", "technical" functions with a leading underscore, so familiarity also works in its favor). The content in this case would be as follows:

export * from './ComponentOne';
export * from './ComponentTwo';

Thus, the initial imports can be reduced to

import { ComponentOne, ComponentTwo } from 'someFolder';

You should be a little more careful with these imports: if you use them inside components like ComponentOne themselves, you get a cyclic dependency, which leads to subtle bugs, so mutual imports inside the folder should be either relative or written in the unshortened form.

Generation of re-export files

Since editing such files and keeping them in sync with the folder contents quickly gets tedious, you can create utilities to generate them. That looks like this:


import fs from 'fs';
import path from 'path';
import chalk from 'chalk';
import { ESLint } from 'eslint';
import { env } from '../../env';
import { paths } from '../../paths';

const eslint = new ESLint({
  fix: true,
  extensions: ['.js', '.ts', '.tsx'],
  overrideConfigFile: path.resolve(paths.rootPath, 'eslint.config.js'),
});

const logsPrefix = chalk.blue(`[WEBPACK]`);

const pathsForExportFiles = [
  {
    folderPath: path.resolve(paths.sourcePath, 'const'),
    exportDefault: false,
  },
  {
    folderPath: path.resolve(paths.validatorsPath, 'api'),
    exportDefault: true,
  },
];

type TypeProcessParams = { changedFiles?: string[] };

class GenerateFiles {
  _saveFile(params: { content?: string; filePath: string; noEslint?: boolean }) {
    const { content, filePath, noEslint } = params;

    if (content == null) return false;

    const oldFileContent = fs.existsSync(filePath) ? fs.readFileSync(filePath, 'utf-8') : '';

    return Promise.resolve()
      .then(() => (noEslint ? content : this._formatTextWithEslint(content)))
      .then(formattedNewContent => {
        if (oldFileContent === formattedNewContent) return false;

        return fs.promises.writeFile(filePath, formattedNewContent, 'utf8').then(() => {
          if (env.LOGS_GENERATE_FILES) console.log(`${logsPrefix} Changed: ${filePath}`);
          return true;
        });
      });
  }

  _excludeFileNames(filesNames: string[], skipFiles?: string[]) {
    const skipFilesArray = ['package.json'].concat(skipFiles || []);
    return filesNames.filter(fileName => !skipFilesArray.some(testStr => fileName.includes(testStr)));
  }

  _formatTextWithEslint(str: string) {
    return eslint.lintText(str).then(data => data[0].output || str);
  }

  generateExportFiles({ changedFiles }: TypeProcessParams) {
    const config =
      changedFiles == null
        ? pathsForExportFiles
        : pathsForExportFiles.filter(({ folderPath }) =>
            changedFiles.some(filePath => filePath.includes(folderPath))
          );

    if (config.length === 0) return false;

    return Promise.all(
      config.map(({ folderPath, exportDefault }) => {
        const { base: folderName } = path.parse(folderPath);
        const generatedFileName = `_${folderName}.ts`;
        const generatedFilePath = path.resolve(folderPath, generatedFileName);

        return Promise.resolve()
          .then(() => fs.promises.readdir(folderPath))
          .then(filesNames => this._excludeFileNames(filesNames, [generatedFileName]))
          .then(filesNames =>
            filesNames.reduce((template, fileName) => {
              const { name: fileNameNoExt } = path.parse(fileName);
              return exportDefault
                ? `${template}export { default as ${fileNameNoExt} } from './${fileNameNoExt}';\n`
                : `${template}export * from './${fileNameNoExt}';\n`;
            }, '// This file is auto-generated\n\n')
          )
          .then(content => this._saveFile({ content, filePath: generatedFilePath, noEslint: true }));
      })
    ).then(filesSavedMarks => filesSavedMarks.some(Boolean));
  }

  process({ changedFiles }: TypeProcessParams) {
    const startTime = Date.now();
    const isFirstGeneration = changedFiles == null;
    let filesChanged = false;

    // Order matters
    return Promise.resolve()
      .then(() => this.generateExportFiles({ changedFiles }))
      .then(changedMark => changedMark && (filesChanged = true))
      .then(() => {
        if (isFirstGeneration || filesChanged) {
          const endTime = Date.now();
          console.log(
            '%s Finished generating files within %s seconds',
            logsPrefix,
            chalk.blue(String((endTime - startTime) / 1000))
          );
        }
        return filesChanged;
      });
  }
}

export const generateFiles = new GenerateFiles();

The scheme is as follows: generateFiles.process({ changedFiles: null }), where you can pass either null (in which case all files will be regenerated) or an array of paths of changed files, in which case re-export files are created only in folders containing changed files. The generated file itself and package.json are excluded from the list of files inside the re-export; in the basic case this is enough. I also added support not only for "regular" re-exports like export * from './MyComponent' but also for re-exporting only the default export as a named one: export { default as MyComponent } from './MyComponent'. This will come in handy later.
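The changedFiles filtering step above boils down to a simple predicate and can be tested in isolation. A standalone sketch, with made-up folder paths for the example:

```typescript
type ExportTarget = { folderPath: string; exportDefault: boolean };

// Keep only targets whose folder contains at least one changed file;
// null means "regenerate everything".
export function selectTargets(
  targets: ExportTarget[],
  changedFiles: string[] | null
): ExportTarget[] {
  if (changedFiles == null) return targets;
  return targets.filter(({ folderPath }) =>
    changedFiles.some(filePath => filePath.includes(folderPath))
  );
}

const targets: ExportTarget[] = [
  { folderPath: '/src/const', exportDefault: false },
  { folderPath: '/src/validators/api', exportDefault: true },
];

// Only /src/const matches this change
console.log(selectTargets(targets, ['/src/const/colors.ts']));
```

Note that substring matching via includes is deliberately simple; it assumes target folders do not share prefixes.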

The next step is to format the file according to the current ESLint and Prettier settings. They have a Node.js API for that, but it is rather time-consuming: on my machine it takes about 0.1 seconds, which is significant by my standards, so I reimplemented the formatting manually by inserting line breaks myself; that way the generation time for a dozen files dropped to thousandths of a second. I would also point out that if the content of the file hasn't changed, I don't rewrite the file, as that would trigger the watchers (e.g., a Webpack rebuild).
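The "don't rewrite unchanged files" trick from the paragraph above can be isolated into a tiny helper. A sketch, under the assumption that callers pass already formatted content:

```typescript
import fs from 'fs';
import os from 'os';
import path from 'path';

// Returns true only if the file was actually (re)written, so watchers
// are not triggered by byte-identical output.
export function saveIfChanged(filePath: string, content: string): boolean {
  const oldContent = fs.existsSync(filePath) ? fs.readFileSync(filePath, 'utf8') : '';
  if (oldContent === content) return false;
  fs.writeFileSync(filePath, content, 'utf8');
  return true;
}

const file = path.join(fs.mkdtempSync(path.join(os.tmpdir(), 'gen-')), '_const.ts');
const first = saveIfChanged(file, '// This file is auto-generated\n');
const second = saveIfChanged(file, '// This file is auto-generated\n');
console.log(first, second); // true false
```

The comparison must happen after formatting, otherwise cosmetic differences would defeat the check.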

Generation of export objects

Sometimes it’s more convenient to generate one named export instead of listing all the files, especially if you need specific handling of names. I use this scheme for working with assets, so the code looks like this:

// icons.tsx
export const icons = {
  arrowLeft: require('./icons/arrow-left.svg'),
  arrowRight: require('./icons/arrow-right.svg'),
};

// MyComponent.tsx
import { icons } from 'assets/icons';
import { icons } from 'assets'; // Or shortened via a re-export file

<img src={icons.arrowLeft} />

This scheme is more convenient than import { arrowLeft } from 'assets/icons'; because it is better not to destructure these names, so as not to confuse them with other entities on the page. A method similar to the previous one is added to the generator file:


const pathsForAssetsExportFiles = [
  { folderPath: path.resolve(paths.assetsPath, 'icons'), exportDefault: false },
  { folderPath: path.resolve(paths.assetsPath, 'images'), exportDefault: true },
];

class GenerateFiles {
  generateAssetsExportFiles({ changedFiles }: TypeProcessParams) {
    const config =
      changedFiles == null
        ? pathsForAssetsExportFiles
        : pathsForAssetsExportFiles.filter(({ folderPath }) =>
            changedFiles.some(filePath => filePath.includes(folderPath))
          );

    if (config.length === 0) return false;

    return Promise.all(
      config.map(({ folderPath, exportDefault }) => {
        const { base: folderName, dir: parentPath } = path.parse(folderPath);
        const generatedFileName = `${folderName}.ts`;
        const generatedFilePath = path.resolve(parentPath, generatedFileName);

        return Promise.resolve()
          .then(() => fs.promises.readdir(folderPath))
          .then(filesNames => {
            const exportObject = this._createExportObjectFromFilesArray({
              folderName,
              filesNames,
              exportDefault,
            });
            return `// This file is auto-generated\n\nexport const ${folderName} = ${this._objToString(exportObject)}`;
          })
          .then(content => this._saveFile({ content, filePath: generatedFilePath, noEslint: true }));
      })
    ).then(filesSavedMarks => filesSavedMarks.some(Boolean));
  }
}

The difference in this case is that require syntax is used, and the generated file does not have the _ prefix, because it creates a new entity, an object with converted keys; accordingly it is not purely technical, and it is quite reasonable to "fall into" it to study the set of possible values. The fact that it is auto-generated is simply emphasized by the comment at the very top of the file.
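The "converted keys" mentioned above mostly means turning file names like arrow-left.svg into object keys like arrowLeft. A minimal sketch of such a converter (the real generator's rules may differ):

```typescript
import path from 'path';

// Convert a kebab-case asset file name to a camelCase export key:
// 'arrow-left.svg' -> 'arrowLeft'
export function fileNameToKey(fileName: string): string {
  const { name } = path.parse(fileName); // strips the extension
  return name.replace(/-([a-z0-9])/g, (_match, ch: string) => ch.toUpperCase());
}

console.log(['arrow-left.svg', 'arrow-right.svg', 'logo.png'].map(fileNameToKey));
```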

I think those who have worked with Webpack have already had the idea: why not use require.context to collect such files "on the fly" without a separate generation step? The answer is simple: TypeScript, the IDE, and the developer cannot know what this require.context contains, so there will be no hints, autocompletion, shortcuts, type checks, and so on, which is why I don't recommend considering this option.

Generating validators from TypeScript models

As I like to say, the task is trivial: read the file with the TS compiler and replace the types with functions of some library that checks values. In the end I want to write the model once and get a file with a validation function in a separate folder. A small utility will do the job, but it takes only one file at a time as a parameter, creating a new compiler instance every time, so a dozen files take several seconds. The compiler can work with arrays of files, so to work efficiently you need to fork this utility and replace the compilation method with the following:

public static compile(
  filePaths: string[],
  options: ICompilerOptions = {
    ignoreGenerics: false,
    ignoreIndexSignature: false,
    inlineImports: false,
  }
) {
  const createProgramOptions = { target: ts.ScriptTarget.Latest, module: ts.ModuleKind.CommonJS };
  const program = ts.createProgram(filePaths, createProgramOptions);
  const checker = program.getTypeChecker();

  return filePaths.map(filePath => {
    const topNode = program.getSourceFile(filePath);
    if (!topNode) {
      throw new Error(`Can't process ${filePath}: ${collectDiagnostics(program)}`);
    }
    const content = new Compiler(checker, options, topNode).compileNode(topNode);
    return { filePath, content };
  });
}

The next step is to create an example file, for instance for querying the API (I separated the types for clarity):


type ApiRoute = {
  url: string;
  name: string;
  method: 'GET' | 'POST';
  headers?: any;
};

type RequestParams = {
  email: string;
  password: string;
};

type ResponseParams = {
  email: string;
  sessionExpires: number;
};

type AuthApiRoute = ApiRoute & { params?: RequestParams; response?: ResponseParams };

export const auth: AuthApiRoute = {
  url: `/auth`,
  name: `auth`,
  method: 'POST',
};

And run an omnivorous (and therefore somewhat bulky) file generator over it:


import { Compiler } from '../../lib/ts-interface-builder';

const pathsForValidationFiles = [{ folderPath: path.resolve(paths.sourcePath, 'api') }];
const modelsPath = path.resolve(paths.sourcePath, 'models');

class GenerateFiles {
  generateValidationFiles({ changedFiles }: TypeProcessParams) {
    const config =
      changedFiles == null
        ? pathsForValidationFiles
        : pathsForValidationFiles.filter(({ folderPath }) =>
            changedFiles.some(
              filePath => filePath.includes(folderPath) || filePath.includes(modelsPath)
            )
          );

    if (config.length === 0) return false;

    return Promise.all(
      config.map(({ folderPath }) => {
        const { base: folderName } = path.parse(folderPath);
        const generatedFileName = `_${folderName}.ts`;
        const generatedFolderPath = path.resolve(paths.validatorsPath, folderName);

        if (!fs.existsSync(generatedFolderPath)) fs.mkdirSync(generatedFolderPath);

        return Promise.resolve()
          .then(() => fs.promises.readdir(folderPath))
          .then(filesNames => this._excludeFileNames(filesNames, [generatedFileName]))
          .then(filesNames => filesNames.map(fileName => path.resolve(folderPath, fileName)))
          .then(filesPaths =>
            Promise.all(
              Compiler.compile(filesPaths, { inlineImports: true }).map(({ filePath, content }) => {
                const { base: fileName } = path.parse(filePath);
                const generatedFilePath = path.resolve(generatedFolderPath, fileName);
                return this._saveFile({ filePath: generatedFilePath, content });
              })
            )
          );
      })
    ).then(filesSavedMarks => _.flatten(filesSavedMarks).some(Boolean));
  }
}

One thing to note here is that you cannot format such files manually, so you have to run them through ESLint, with the corresponding time cost. The utility is given { inlineImports: true } so that type imports are inlined directly into the output file (otherwise they would not be checked), and the filePath.includes(modelsPath) check makes this processing trigger when models change. Building the full dependency tree is not a trivial task, so in my proposed version this part is maintained manually.

Thus, running this method will generate a file like this:


import * as t from 'ts-interface-checker';
// tslint:disable:object-literal-key-quotes

export const ApiRoute = t.iface([], {
  url: 'string',
  name: 'string',
  method: t.union(t.lit('GET'), t.lit('POST')),
  headers: t.opt('any'),
});

export const RequestParams = t.iface([], {
  email: 'string',
  password: 'string',
});

export const ResponseParams = t.iface([], {
  email: 'string',
  sessionExpires: 'number',
});

export const AuthApiRoute = t.intersection(
  'ApiRoute',
  t.iface([], {
    params: t.opt('RequestParams'),
    response: t.opt('ResponseParams'),
  })
);

const exportedTypeSuite: t.ITypeSuite = {
  ApiRoute,
  RequestParams,
  ResponseParams,
  AuthApiRoute,
};

export default exportedTypeSuite;

which can be used, for example, as follows:

import { createCheckers } from 'ts-interface-checker';
import * as apiValidatorsTypes from 'validators/api';

const apiValidators = _.mapValues(apiValidatorsTypes, value => createCheckers(value));

function validateRequestParams({ route, params }) {
  return Promise.resolve()
    .then(() => {
      const requestValidator = _.get(apiValidators, `${[route.name]}.TypeRequestParams`);
      return requestValidator.strictCheck(params);
    })
    .catch(error => {
      throw createError(
        errorsNames.VALIDATION,
        `request: (request params) ${error.message} for route "${route.name}"`
      );
    });
}

This way, a human-readable error is produced when sending a request if the types don't match. Unfortunately, the ts-interface-checker library doesn't work with generics, but for my needs the current functionality, deep checking of nested objects, fits perfectly.
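To illustrate what such a runtime check does, here is a hand-rolled, much-simplified stand-in (this is not the ts-interface-checker API, just a sketch in the same spirit): it walks a plain schema and throws a readable error with the offending path:

```typescript
type Schema = { [key: string]: 'string' | 'number' | Schema };

// Deeply check that `value` matches `schema`, throwing a readable
// error naming the offending path, similar in spirit to strictCheck.
export function check(value: any, schema: Schema, prefix = 'value'): void {
  for (const [key, expected] of Object.entries(schema)) {
    const actual = value?.[key];
    const where = `${prefix}.${key}`;
    if (typeof expected === 'string') {
      if (typeof actual !== expected) {
        throw new Error(`${where} is not a ${expected}`);
      }
    } else {
      check(actual, expected, where); // recurse into nested objects
    }
  }
}

const requestSchema: Schema = { email: 'string', password: 'string' };

check({ email: 'a@b.c', password: 'x' }, requestSchema); // passes silently
try {
  check({ email: 'a@b.c' }, requestSchema);
} catch (e) {
  console.log((e as Error).message); // value.password is not a string
}
```

The real library additionally handles unions, literals, optional fields, and strictness modes, which is exactly why generating its checkers beats writing them by hand.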

Integration with the build process

Running all the generation recipes before the build is simple: just run an additional script (in my case there already is one – webpackBuider.ts) that performs the generateFiles.process({}) step (with empty changedFiles, meaning a full pass that creates files and overwrites them if they have changed). But with rebuilds the most interesting part begins, because we have to interact with Webpack, a "thing-in-itself", in one of two possible scenarios:

  • Integration in compiler.hooks.watchRun :


import webpack from 'webpack';
import { generateFiles } from '../utils/generateFiles';

class ChangedFiles {
  apply(compiler: webpack.Compiler) {
    compiler.hooks.watchRun.tapAsync('GenerateFiles_WatchRun', (comp, done) => {
      const watcher = comp.watchFileSystem.watcher || comp.watchFileSystem.wfs.watcher;
      const changedFiles = Object.keys(watcher.mtimes);

      return changedFiles.length
        ? generateFiles.process({ changedFiles }).then(() => done())
        : done();
    });
  }
}

export const pluginChangedFiles: webpack.Plugin = new ChangedFiles();

Thus, an asynchronous action runs before the rebuild starts, and its callback continues the rebuild once generation finishes. In the case of parallel isomorphic builds, this solution used in just one of the builds won't work, of course, since it has to be executed before both of them, taking into account the changed files of both. To do this, I had to fork parallel-webpack and set up communication between the three processes via node-ipc, blocking the neighboring builds until the common generation finished. I won't dwell on this in detail – you can look at the final repository if you're interested – but this solution in general has a lot of drawbacks:

  • generated files don't get into the current build cycle, despite the asynchronous hook. Thus, if the first build generated a _const.ts file, a second build will run after it completes, as the Webpack watcher considers it a new change.
  • when deleting any file that was referenced in the re-export file, the watchRun hook is never reached – the build crashes in advance with an error, and you have to restart.
  • when a new file is added to a folder, it is of course not picked up, because it is not in Webpack's scope.

I was able to solve the first two problems with the following plugin:

new webpack.ProgressPlugin(percentage => {
  if (percentage === 0) generateFilesSync.process({ changedFiles });
});

However, it only works synchronously, which cannot be guaranteed given the asynchronous nature of some recipes. All in all, even after thoroughly examining the Webpack source code, I was never able to hook into its watching system in a way that blocks rebuilds and adds the modified files to them.

  • A separate process watching for changed files:


function startFileWatcher() {
  let changedFiles: string[] = [];
  let isGenerating = false;
  let watchDebounceTimeout = null;

  watch(paths.sourcePath, { recursive: true }, function fileChanged(event, filePath) {
    if (filePath) changedFiles.push(filePath);
    if (isGenerating) return false;

    clearTimeout(watchDebounceTimeout);
    watchDebounceTimeout = setTimeout(() => {
      isGenerating = true;
      generateFiles.process({ changedFiles }).then(() => {
        isGenerating = false;
        if (changedFiles.length > 0) fileChanged(null, null);
      });
      changedFiles = [];
    }, 10);
  });
}

function afterFirstBuild() {
  startFileWatcher();
}

In this case it simply starts tracking an entire folder (or several folders), so files are generated independently of Webpack, which means deletion and creation of files are handled correctly. The Webpack config has an aggregateTimeout parameter (which in my setup is set via env parameters): the delay between a file change and the start of a rebuild. Accordingly, if generating the files takes less time than this value, only one rebuild is started, with the resulting files included, which is exactly the result we want. However, this value is a constant, and generation can sometimes take longer, resulting in two rebuilds instead of one.
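The debounce logic in the watcher above can be sketched in a testable form; here generation is replaced by a plain callback, and the 10 ms window is a stand-in for the real delay:

```typescript
// Collect change events and fire one generation per burst of changes,
// mirroring the watcher's debounce in an isolated, testable form.
export function makeBatcher(onBatch: (files: string[]) => void, delayMs = 10) {
  let pending: string[] = [];
  let timer: ReturnType<typeof setTimeout> | null = null;

  return function fileChanged(filePath: string) {
    pending.push(filePath);
    if (timer) clearTimeout(timer); // restart the debounce window
    timer = setTimeout(() => {
      const batch = pending;
      pending = [];
      onBatch(batch);
    }, delayMs);
  };
}

const batches: string[][] = [];
const fileChanged = makeBatcher(files => batches.push(files));
fileChanged('a.ts');
fileChanged('b.ts'); // arrives within the debounce window: same batch

setTimeout(() => console.log(batches), 50); // one batch: [['a.ts', 'b.ts']]
```

As long as one batch (debounce plus generation) completes within aggregateTimeout, Webpack sees a single burst of changes and runs a single rebuild.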

Unfortunately, I couldn't find a perfect solution, but since I try to optimize rebuilds as much as possible, in this case I put up with the rare double ones.

I use this generator not only to simplify the routine tasks mentioned above, but also as part of a larger architectural solution (microfrontends, don't laugh); I hope to get to that topic in a few articles.


Happy coding!
