The goal of this research is to apply the boosting technique to the C4.5 algorithm in order to reduce the number of misclassifications. AdaBoost, one such boosting technique, compensates for class imbalance by assigning greater weight to misclassified instances, thereby altering the data distribution across iterations. The method is evaluated on the Online Shoppers Purchasing Intention dataset from the UCI Machine Learning Repository, which contains 12,330 records and 18 attributes, one of which is the class label. The experiments were conducted in RapidMiner 9.10. Before processing, the dataset was split into training and testing sets at ratios of 90:10 and 80:20. With boosting, the C4.5 algorithm achieved an accuracy of 95.07 percent and an AUC of 0.966 at both the 90:10 and the 80:20 ratio. Without boosting, C4.5 achieved an accuracy of 89.02 percent and an AUC of 0.845 at 90:10, and an accuracy of 88.1 percent and an AUC of 0.845 at 80:20. Based on this comparison, boosting the C4.5 algorithm performs considerably better than the standard C4.5 algorithm, with an average accuracy increase of 5.76 percent. We conclude that applying the boosting technique to the C4.5 algorithm yields higher accuracy and reduces classification error.
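The reweighting idea at the heart of AdaBoost can be sketched in plain Python. This is a minimal illustration only: it uses one-level decision stumps as the weak learner (an assumption for brevity, whereas the paper boosts full C4.5 trees inside RapidMiner 9.10), and the six 2D points below are hypothetical, not the UCI data.

```python
import math

def train_stump(X, y, w):
    """Return (error, feature, threshold, polarity) minimizing weighted error."""
    best = None
    for j in range(len(X[0])):
        values = sorted(set(row[j] for row in X))
        for t in [(a + b) / 2 for a, b in zip(values, values[1:])]:
            for polarity in (1, -1):
                preds = [polarity if row[j] > t else -polarity for row in X]
                err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                if best is None or err < best[0]:
                    best = (err, j, t, polarity)
    return best

def adaboost(X, y, rounds):
    n = len(X)
    w = [1.0 / n] * n              # start from a uniform distribution
    ensemble = []
    for _ in range(rounds):
        err, j, t, pol = train_stump(X, y, w)
        err = max(err, 1e-10)      # avoid log(0) for a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, j, t, pol))
        # Reweight: misclassified points gain weight, so the next weak
        # learner concentrates on them -- this is the "change in how the
        # data is spread out" that the abstract refers to.
        for i, row in enumerate(X):
            pred = pol if row[j] > t else -pol
            w[i] *= math.exp(-alpha * y[i] * pred)
        total = sum(w)
        w = [wi / total for wi in w]  # renormalize to a distribution
    return ensemble

def predict(ensemble, row):
    # Final classifier: weighted majority vote of the weak learners.
    s = sum(a * (pol if row[j] > t else -pol) for a, j, t, pol in ensemble)
    return 1 if s >= 0 else -1

# Hypothetical 2D points labelled +1 iff x0 + x1 >= 4; no single stump
# classifies all six correctly, but a few boosting rounds do.
X = [(1, 1), (2, 1), (3, 1), (1, 2), (2, 2), (3, 2)]
y = [-1, -1, 1, -1, 1, 1]
ensemble = adaboost(X, y, rounds=5)
```

The `alpha` weights make accurate weak learners count more in the final vote, which is how the ensemble can exceed the accuracy of any individual tree.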