
There are 40 to 50 jar files in my project, and every time it takes a lot of effort to figure out the latest version of each jar. Can you help me write a Java program for this?


2 Answers


You probably just want to use Maven: http://maven.apache.org/ or another dependency manager such as Ivy.
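
With Maven, for example, each library is declared once in pom.xml with an explicit version, and upgrading means editing that version instead of hunting through the lib folder. A minimal, illustrative dependency block (the coordinates shown are only an example):

<dependencies>
  <dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
  </dependency>
</dependencies>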

Answered on 2012-12-04T12:23:38.513
Call this method at Ant build time:
public void expungeDuplicates(String filePath) {
    try {
        // Collect the names of all plain files in the lib folder
        File folder = new File(filePath);
        File[] listOfFiles = folder.listFiles();
        List<String> jarList = new ArrayList<String>();
        if (listOfFiles != null) {
            for (File file : listOfFiles) {
                if (file.isFile()) {
                    jarList.add(file.getName());
                }
            }
        }
        if (!jarList.isEmpty()) {
            // Map of duplicate / lower-version jar names to how often they were flagged
            Map<String, Integer> replaceJarsMap = PatternClassifier.findDuplicatesOrLowerVersion(jarList);
            System.err.println("Duplicate / lower version - total count: " + replaceJarsMap.size());
            for (Map.Entry<String, Integer> entry : replaceJarsMap.entrySet()) {
                String key = entry.getKey();
                System.out.println(key + " : " + entry.getValue());
                // The same path can only be deleted once, so a single attempt is enough
                File deleteFile = new File(filePath + File.separator + key);
                if (deleteFile.exists() && deleteFile.delete()) {
                    System.err.println(key + " deleted");
                }
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
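
As a usage sketch only (JarCleaner and the default "lib" path are placeholder names assumed here), the method could be wrapped in a small class whose main an Ant <java> task runs during the build:

public class JarCleaner {

    public static void main(String[] args) {
        // First argument: path to the project's lib folder; "lib" is only a placeholder default
        String libPath = args.length > 0 ? args[0] : "lib";
        new JarCleaner().expungeDuplicates(libPath);
    }

    // ... paste the expungeDuplicates(String filePath) method from above here,
    // together with the java.io.File and java.util imports it needs ...
}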

You only need to pass the path of your lib folder to this function; it will find all the duplicate or lower-version files. The crucial function is given below: it finds the duplicates in the list of file names you provide.

public static Map<String, Integer> findDuplicatesOrLowerVersion(List<String> fileNameList) {
    List<String> oldJarList = new ArrayList<String>();
    Map<String, Integer> duplicateEntryMap = new HashMap<String, Integer>();
    String[] cmprTemp;
    boolean match = false;
    String regex = "", compareName = "", tempCompareName = "", tempJarName = "";
    String verInfo1 = "", verInfo2 = "";
    // Only plain .jar file names are considered
    String regexFileType = "[0-9a-zA-Z\\-\\._]*\\.jar$";
    int count = 0;
    // Sort in reverse order so that higher versions tend to come first
    Collections.sort(fileNameList, Collections.reverseOrder());
    try {
        int size = fileNameList.size();
        for (int i = 0; i < size; i++) {
            // Strip digits, dots and underscores to get the "base name" of the jar
            cmprTemp = fileNameList.get(i).split("[0-9\\._]*");
            for (String s : cmprTemp) {
                compareName += s;
            }
            // A candidate duplicate must share this base name, followed only by
            // version digits and a "jar" extension
            regex = "^" + compareName + "[ajr0-9_\\-\\.]*";
            if (fileNameList.get(i).matches(regexFileType) && !oldJarList.contains(fileNameList.get(i))) {
                for (int j = i + 1; j < size; j++) {
                    cmprTemp = fileNameList.get(j).split("[0-9\\._]*");
                    for (String s : cmprTemp) {
                        tempCompareName += s;
                    }
                    match = fileNameList.get(j).matches(regexFileType) && tempCompareName.matches(regex);
                    if (match) {
                        // Keep only the numeric parts of both names to compare versions
                        cmprTemp = fileNameList.get(i).split("[a-zA-Z\\-\\._]*");
                        for (String s : cmprTemp) {
                            verInfo1 += s;
                        }
                        verInfo1 += "000";
                        cmprTemp = fileNameList.get(j).split("[a-zA-Z\\-\\._]*");
                        for (String s : cmprTemp) {
                            verInfo2 += s;
                        }
                        verInfo2 += "000";
                        // Compare only the common-length prefix of the two version strings
                        int length = Math.min(verInfo1.length(), verInfo2.length());
                        if (Long.parseLong(verInfo1.substring(0, length)) >= Long.parseLong(verInfo2.substring(0, length))) {
                            // File j is an older (or equal) version: mark it for deletion
                            count = 0;
                            if (!oldJarList.contains(fileNameList.get(j))) {
                                oldJarList.add(fileNameList.get(j));
                                duplicateEntryMap.put(fileNameList.get(j), ++count);
                            } else {
                                count = duplicateEntryMap.get(fileNameList.get(j));
                                duplicateEntryMap.put(fileNameList.get(j), ++count);
                            }
                        } else {
                            // File i itself is the older one; remember it and mark it after the inner loop
                            tempJarName = fileNameList.get(i);
                        }
                        match = false;
                        verInfo1 = "";
                        verInfo2 = "";
                    }
                    tempCompareName = "";
                }
                if (tempJarName != null && !tempJarName.equals("")) {
                    count = 0;
                    if (!oldJarList.contains(fileNameList.get(i))) {
                        oldJarList.add(fileNameList.get(i));
                        duplicateEntryMap.put(fileNameList.get(i), ++count);
                    } else {
                        count = duplicateEntryMap.get(fileNameList.get(i));
                        duplicateEntryMap.put(fileNameList.get(i), ++count);
                    }
                    tempJarName = "";
                }
            }
            compareName = "";
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return duplicateEntryMap;
}

What does the findDuplicatesOrLowerVersion(List<String> fileNameList) function do? Simply put, it finds the duplicates and returns a map containing each file name and the number of times the lower version repeats.
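
As a small, hedged illustration (the jar names and the FindDuplicatesDemo class are made up; PatternClassifier is the class holding the method above), calling the method directly shows the kind of map it returns, with the lower of two versions appearing as a key:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;

public class FindDuplicatesDemo {
    public static void main(String[] args) {
        // Two versions of the same library plus an unrelated jar
        List<String> jars = new ArrayList<String>(Arrays.asList(
                "log4j-1.2.17.jar", "log4j-1.2.15.jar", "commons-io-2.4.jar"));
        Map<String, Integer> lowerVersions = PatternClassifier.findDuplicatesOrLowerVersion(jars);
        // Expected to print something like {log4j-1.2.15.jar=1}
        System.out.println(lowerVersions);
    }
}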

Try this. The files remaining in the folder should then be the latest versions, without duplicates. I am using this to find the oldest files; on that basis it finds the old ones and deletes them.
Note that this only checks the file names; further improvements can be made.

Here PatternClassifier is the class that contains the second method shown above.
Answered on 2013-02-16T13:47:46.407