Tuesday, July 25, 2023

Java DecimalFormat patterns: differences between '#' and '0'

I have the following code:

import java.text.DecimalFormat;
import java.text.NumberFormat;
import java.util.Locale;

public class Main {
    public static void main(String[] args) {
        String pattern = ".00";         // a) => |.46|
//        String pattern = ".##";       // b) => |.46|
//        String pattern = ".0#";       // c) => |.46|
//        String pattern = "#.00";      // d) => |.46|
//        String pattern = "#.0#";      // e) => |.46|
//        String pattern = "#.##";      // f) => |0.46|
//        String pattern = ".#0";       // g) => IllegalArgumentException: Malformed pattern ".#0"

        double value = 0.456;
        DecimalFormat df = (DecimalFormat) NumberFormat.getNumberInstance(Locale.US);
        df.applyPattern(pattern);
        String output = df.format(value);
        System.out.printf("|%s|", output);
    }
}

Could someone explain why in f) the zero before the decimal point is still displayed, while in d) and e) it is not, even though all three patterns have a '#' before the decimal point? Also, what is the reason for the exception in g)? It looks like a valid pattern to me.
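To dig a bit further, I also printed df.toPattern() after applying each pattern, since that shows the normalized form DecimalFormat ends up using internally; on my JDK, for example, "#.##" comes back as "#0.##" while "#.00" stays "#.00". Here is the small probe I used (the PatternProbe class name and the output layout are my own):

import java.text.DecimalFormat;
import java.text.NumberFormat;
import java.util.Locale;

public class PatternProbe {
    public static void main(String[] args) {
        String[] patterns = { ".00", ".##", ".0#", "#.00", "#.0#", "#.##", ".#0" };
        double value = 0.456;
        for (String pattern : patterns) {
            DecimalFormat df = (DecimalFormat) NumberFormat.getNumberInstance(Locale.US);
            try {
                df.applyPattern(pattern);
                // toPattern() reports the pattern as DecimalFormat normalized it
                System.out.printf("%-6s -> toPattern()=%-6s format=|%s|%n",
                        pattern, df.toPattern(), df.format(value));
            } catch (IllegalArgumentException e) {
                // ".#0" lands here with: Malformed pattern ".#0"
                System.out.printf("%-6s -> %s%n", pattern, e.getMessage());
            }
        }
    }
}

I also noticed that the pattern grammar in the DecimalFormat Javadoc defines the fraction part as minimum digits ('0') followed by optional digits ('#'), which may be why ".#0" is rejected, but I would appreciate an explanation of both behaviors.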
