This tool is designed for the analysis and safe refactoring of CSS code. Currently, it detects three different types of duplication in CSS files and refactors them safely.
I am developing this tool as infrastructure for my PhD research on CSS.
This project is licensed under the MIT License.
We are developing an Eclipse plugin for css-analyser.
You will need Java 8 installed on your machine to run this tool.
We use Gradle as the build system. After installing Gradle, run `gradle build` in the root of the project.
When the build is finished, the generated standalone jar files can be found in `build/distributions`, inside the zip or tar archives (either of the archives may be used).
For convenience, Gradle generates scripts (inside the `bin` folder in the zip and tar archives) for running the tool under Windows (named `css-analyzer.bat`) or under other operating systems (named `css-analyser`).
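On a Unix-like system, the build steps above might look like the following sketch (the archive and folder names are illustrative; the actual files may carry a version suffix):

```shell
# Build the project; distribution archives end up in build/distributions
gradle build

# Unpack the zip distribution (the tar archive works equally well)
unzip build/distributions/css-analyser.zip -d dist/

# Run the generated launcher script (use css-analyzer.bat on Windows)
dist/css-analyser/bin/css-analyser
```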
You can run `gradle eclipse` to download the dependencies and generate the Eclipse project files (including `.classpath`, `.project`, etc.).
This tool supports three modes:
- **Crawl mode**: In this mode, the tool uses Crawljax to crawl the web pages of the given URL(s). It then analyzes all the CSS files of the crawled web pages with respect to the collected DOM states. To use this mode, run with `--mode crawl --url "http://to.be.analyzed" --out-folder "path/to/analyzed/info/folder"`. The tool saves the DOM states collected by Crawljax under the path given by `--out-folder`. It also creates a folder called `css`, in which all the CSS files are saved. It is also possible to use `--urls-file "path/to/file"` to provide a list of websites for analysis; website URLs must be given one per line in this file.
- **Folder mode**: If data from a previous crawl is available, one may use `--mode folder --in-folder "path/to/crawled/data"` to avoid re-crawling the web pages. The parameter `--foldersfile "path/to/list/of/folders"` can also be used to provide a file containing a list of paths to crawled websites.
- **NODOM mode**: This mode analyzes CSS files without using corresponding DOM states. One must provide a folder containing the CSS files (with the `.css` extension) for analysis using `--in-folder "path/to/css/files/"`. Note that safe refactoring is not possible in this mode.
Using `--min-sup`, one may provide the minimum support count for FP-Growth, that is, the minimum number of selectors that share one or more duplicated declarations. The default value is 2.
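Putting the modes together, hypothetical invocations might look like the following (all paths are placeholders, `css-analyser` stands for the generated launcher script, and the `--mode` value for NODOM mode is assumed to be `nodom`; the flags themselves are the ones described above):

```shell
# Crawl mode: crawl one URL, saving DOM states and CSS files under output/
css-analyser --mode crawl --url "http://to.be.analyzed" --out-folder "output/"

# Crawl mode with a list of websites, one URL per line in urls.txt
css-analyser --mode crawl --urls-file "urls.txt" --out-folder "output/"

# Folder mode: reuse previously crawled data instead of re-crawling
css-analyser --mode folder --in-folder "output/"

# NODOM mode: analyze plain CSS files, with a minimum support count of 3
css-analyser --mode nodom --in-folder "css-files/" --min-sup 3
```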
css-analyser supports detecting duplicated declarations in CSS files and abstracting the duplications into mixins in a preprocessor language (for now, only Less syntax). The best way to take advantage of this feature is to use the Eclipse plugin.
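As a hand-written illustration of what such an abstraction looks like in Less (not the tool's actual output), two rules sharing duplicated declarations can be rewritten to call a common mixin:

```less
// Before: .header and .footer duplicate two declarations
.header { color: #333; padding: 10px; font-weight: bold; }
.footer { color: #333; padding: 10px; font-size: 12px; }

// After: the shared declarations are extracted into a mixin.
// The parentheses make .shared a mixin definition only, so it
// produces no output of its own in the compiled CSS.
.shared() {
  color: #333;
  padding: 10px;
}
.header { .shared(); font-weight: bold; }
.footer { .shared(); font-size: 12px; }
```

Compiling the "after" version with Less reproduces the original declarations in both rules, so the refactoring preserves the rendered styles.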