MINIMIZE THE XOR

You are given two integer arrays, A of size N and B of size Q.

Consider all subsequences of A and compute the sum of each subsequence individually.
Then, for each B[i], find the subsequence sum that gives the minimum XOR with B[i], and count how many subsequences produce that sum.

Return a matrix C of size Q x 2 where,

C[i][0] = the subsequence sum giving the minimum XOR with B[i],

C[i][1] = the number of subsequences giving the minimum XOR with B[i].
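A subset-sum style dynamic program is one natural way to tabulate how many subsequences produce each possible sum. The sketch below is in Python; the function name count_subsequence_sums is illustrative, and it assumes the counts are returned exactly (the statement does not ask for them modulo anything, and with N up to 100 they can be very large, which Python's integers absorb).

    # Minimal sketch: for every achievable sum, count how many non-empty
    # subsequences of A produce it.
    def count_subsequence_sums(A):
        max_sum = sum(A)
        counts = [0] * (max_sum + 1)   # counts[s] = subsequences summing to s
        counts[0] = 1                  # start from the empty subsequence
        for a in A:
            # iterate sums downwards so each element is used at most once
            for s in range(max_sum, a - 1, -1):
                counts[s] += counts[s - a]
        counts[0] -= 1                 # drop the empty subsequence
        return counts

With A = [1, 2, 3] this gives counts[3] = 2, counts[5] = 1 and counts[6] = 1, matching the example below.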



Input Format

The first argument given is the integer array A.
The second argument given is the integer array B.

Output Format

Return a matrix C of size Q x 2 where,

C[i][0] = the subsequence sum giving the minimum XOR with B[i],

C[i][1] = the number of subsequences giving the minimum XOR with B[i].

Constraints

1 <= N <= 100
1 <= A[i] <= 1000
1 <= Q <= 200000
1 <= B[i] <= 10^9
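
Because N <= 100 and A[i] <= 1000, no subsequence sum exceeds 100000, so the table above stays small; the heavier cost is the up to 200000 queries. A direct scan over the achievable sums, sketched below, is the simplest way to answer them (answer_queries is an illustrative name, and counts is assumed to be the table from the earlier sketch); at the upper constraint limits, a binary trie over the achievable sums would likely be needed to bring each query down to about 17 bit steps.

    # Minimal sketch: answer each query by scanning the achievable sums.
    # counts[s] is the number of subsequences of A whose sum is s.
    def answer_queries(counts, B):
        achievable = [s for s, c in enumerate(counts) if c > 0]
        C = []
        for b in B:
            # XOR with b is injective in s, so the minimizing sum is unique
            best = min(achievable, key=lambda s: s ^ b)
            C.append([best, counts[best]])
        return C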

For Example

Input 1:
    A = [1, 2, 3]
    B = [3, 5, 7]
Output 1:
    [ [3, 2] 
      [5, 1] 
      [6, 1] ]
Explanation 1:
    All possible subsequence sums = [1, 2, 3, 3, 4, 5, 6], from (1), (2), (3), (1, 2), (1, 3), (2, 3), (1, 2, 3)
    1. For B[0] = 3, the sum 3 gives the minimum XOR (3 XOR 3 = 0); there are 2 subsequences with sum 3.
    2. For B[1] = 5, the sum 5 gives the minimum XOR (5 XOR 5 = 0); there is only 1 subsequence with sum 5.
    3. For B[2] = 7, the sum 6 gives the minimum XOR (7 XOR 6 = 1); there is only 1 subsequence with sum 6.
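For small inputs the expected answer can also be checked by brute force over every non-empty subsequence; the snippet below is only an illustrative check of the sample, not the intended solution.

    from itertools import combinations

    # Brute-force check of the sample: enumerate every non-empty subsequence,
    # record its sum, then answer each query by minimizing sum XOR B[i].
    A = [1, 2, 3]
    B = [3, 5, 7]
    sums = [sum(c) for r in range(1, len(A) + 1) for c in combinations(A, r)]
    C = []
    for b in B:
        best = min(set(sums), key=lambda s: s ^ b)
        C.append([best, sums.count(best)])
    print(C)  # [[3, 2], [5, 1], [6, 1]]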
NOTE: You only need to implement the given function. Do not read input, instead use the arguments to the function. Do not print the output, instead return values as specified.