I'm trying to encrypt a string in Swift 3, and my encryption is giving different output every time. Why is that? (I have tried similar encryption in python, and the encrypted output is always the same).
Here is my Swift 3 aesEncrypt function:
func aesEncrypt(key:String, iv:Array<Any>, options:Int = kCCOptionPKCS7Padding) -> String? {
    if let keyData = sha256(string:key),
        let data = self.data(using: String.Encoding.utf8),
        let cryptData = NSMutableData(length: Int((data.count)) + kCCBlockSizeAES128) {
        let keyLength = size_t(kCCKeySizeAES128)
        let operation: CCOperation = UInt32(kCCEncrypt)
        let algorithm: CCAlgorithm = UInt32(kCCAlgorithmAES128)
        let options: CCOptions = UInt32(options)
        var numBytesEncrypted: size_t = 0
        let cryptStatus = CCCrypt(operation,
                                  algorithm,
                                  options,
                                  (keyData as NSData).bytes, keyLength,
                                  iv,
                                  (data as NSData).bytes, data.count,
                                  cryptData.mutableBytes, cryptData.length,
                                  &numBytesEncrypted)
        // ADDED PRINT STATEMENTS
        print("keyData")
        print(keyData)
        print("\(keyData as NSData)")
        print("iv")
        print(iv)
        var hex_iv = toHexString(arr: iv as! [UInt8])
        print(hex_iv)
        print("data")
        print(data)
        print("\(data as NSData)")
        print("encryption: cryptdata")
        print(cryptData)
        print("encryption: num bytes encrypted")
        print(numBytesEncrypted)
        if UInt32(cryptStatus) == UInt32(kCCSuccess) {
            cryptData.length = Int(numBytesEncrypted)
            let base64cryptString = cryptData.base64EncodedString(options: .lineLength64Characters)
            return base64cryptString
        }
        else {
            return nil
        }
    }
    return nil
}
When I try to run the following code with initial_string = "hello", I get different encrypted output strings every time.
let iv = [UInt8](repeating: 0, count: 16)
let key = "sample_key"
let initial_string = "hello"
let encryptedString = initial_string.aesEncrypt(key: key, iv: iv)
print("Encrypted string")
print(encryptedString)
Sample output from running the code with the "hello" string (first run):
keyData
32 bytes
<d5a78c66 e9b3ed40 b3a92480 c732527f 1a919fdc f68957d2 b7e9218f 6221085d>
iv
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00
data
5 bytes
<68656c6c 6f>
encryption: cryptdata
<b17d67fc 26e3f316 6a2bdfbf 9d387c2d 00000000 00>
encryption: num bytes encrypted
16
Encrypted string
Optional("sX1n/Cbj8xZqK9+/nTh8LQ==")
Sample output from running the code with the "hello" string (second run):
keyData
32 bytes
<d5a78c66 e9b3ed40 b3a92480 c732527f 1a919fdc f68957d2 b7e9218f 6221085d>
iv
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00
data
5 bytes
<68656c6c 6f>
encryption: cryptdata
<01b9f69b 45deb31d eda46c2d dc9ad9e8 00000000 00>
encryption: num bytes encrypted
16
Encrypted string
Optional("Abn2m0Xesx3tpGwt3JrZ6A==")
Can you tell me why the output is different every time for the same key, iv, and string? Thanks!
Disclaimer: I can't run the question code. Among other things, it is not complete: the extension declaration is missing. It also seems to be Swift 2 code, which needs to be updated to at least Swift 3.
encryption: cryptdata
<01b9f69b 45deb31d eda46c2d dc9ad9e8 00000000 00>
This is completely wrong; it is even the wrong length. Encrypted data will be a multiple of the block size.
With PKCS#7 padding and CBC mode the encrypted result should be: C99A30D8DA44968418E8B66F42790216. See the Cryptomathic AES CALCULATOR. Note that 0b0b0b0b0b0b0b0b0b0b0b is the PKCS#7 padding (11 bytes of 0x0b filling out the 16-byte block).
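For reference, that value can be reproduced with a short Python sketch (an assumption on my part: the pycryptodome package is installed); the key derivation matches the SHA256-of-passphrase scheme in the question:

import hashlib
from Crypto.Cipher import AES
from Crypto.Util.Padding import pad

key = hashlib.sha256(b"sample_key").digest()        # d5a78c66... as printed above
iv = b"\x00" * 16                                   # 16 zero bytes
cipher = AES.new(key, AES.MODE_CBC, iv)
ct = cipher.encrypt(pad(b"hello", AES.block_size))  # PKCS#7 pad, then encrypt
print(ct.hex())  # expected: c99a30d8da44968418e8b66f42790216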
Here is an example in Swift 3, this is not production code, it is missing error handling at a minimum.
func SHA256(string:String) -> Data {
    let data = string.data(using:.utf8)!
    var hashData = Data(count: Int(CC_SHA256_DIGEST_LENGTH))
    _ = hashData.withUnsafeMutableBytes {digestBytes in
        data.withUnsafeBytes {messageBytes in
            CC_SHA256(messageBytes, CC_LONG(data.count), digestBytes)
        }
    }
    return hashData
}
func aesCBCEncrypt(data:Data, keyData:Data, ivData:Data) -> Data {
    let cryptLength = size_t(kCCBlockSizeAES128 + data.count + kCCBlockSizeAES128)
    var cryptData = Data(count:cryptLength)
    var numBytesEncrypted: size_t = 0
    let cryptStatus = cryptData.withUnsafeMutableBytes {cryptBytes in
        data.withUnsafeBytes {dataBytes in
            keyData.withUnsafeBytes {keyBytes in
                ivData.withUnsafeBytes {ivBytes in
                    CCCrypt(CCOperation(kCCEncrypt),
                            CCAlgorithm(kCCAlgorithmAES),
                            CCOptions(kCCOptionPKCS7Padding),
                            keyBytes, keyData.count,
                            ivBytes,
                            dataBytes, data.count,
                            cryptBytes, cryptLength,
                            &numBytesEncrypted)
    }}}}
    cryptData.count = (cryptStatus == kCCSuccess) ? numBytesEncrypted : 0
    return cryptData
}
let keyString = "sample_key"
let keyData = SHA256(string:keyString)
print("keyString: \(keyString)")
print("keyData: \(hexEncode(keyData))")
let clearData = hexDecode("68656c6c6f")
// let keyData = hexDecode("d5a78c66e9b3ed40b3a92480c732527f1a919fdcf68957d2b7e9218f6221085d")
let ivData = hexDecode("00000000000000000000000000000000")
print("clearData: \(hexEncode(clearData))")
print("keyData: \(hexEncode(keyData))")
print("ivData: \(hexEncode(ivData))")
let cryptData = aesCBCEncrypt(data:clearData, keyData:keyData, ivData:ivData)
print("cryptData: \(hexEncode(cryptData))")
Output:
keyString: sample_key
keyData: d5a78c66e9b3ed40b3a92480c732527f1a919fdcf68957d2b7e9218f6221085d
clearData: 68656c6c6f
keyData: d5a78c66e9b3ed40b3a92480c732527f1a919fdcf68957d2b7e9218f6221085d
ivData: 00000000000000000000000000000000
cryptData: c99a30d8da44968418e8b66f42790216
You see that IV? That is the initialization vector; it functions like a counter to make your encryption different each time, making it more secure and harder to break. So basically your code is working fine, but to decrypt correctly the receiver cannot get by with the key alone; they need the IV as well.
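To illustrate that point, a minimal Python sketch (again assuming pycryptodome): encrypting the same plaintext under the same key with a fresh random IV each time yields a different ciphertext, and the IV has to travel with the message for decryption to work:

from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad

key = get_random_bytes(16)
for _ in range(2):
    iv = get_random_bytes(16)  # fresh IV per message
    ct = AES.new(key, AES.MODE_CBC, iv).encrypt(pad(b"hello", AES.block_size))
    print(iv.hex(), ct.hex())  # same key and plaintext, different ciphertext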
I have generated an encrypted text in Python using cryptography:
from cryptography.fernet import Fernet
message = "my deep dark secret".encode()
f = Fernet(key)
encrypted = f.encrypt(message)
# decrypting
from cryptography.fernet import Fernet
encrypted = b"...encrypted bytes..."
f = Fernet(key)
decrypted = f.decrypt(encrypted)
ENCRYPTION INFO:
KEY: b'3b-Nqg6ry-jrAuDyVjSwEe8wrdyEPQfPuOQNH1q5olE='
ENC_MESSAGE: b'gAAAAABhBRBGKSwa7AluNJYhwWaHrQGwAA8UpMH8Wtw3tEoTD2E_-nbeoAvxbtBpFiC0ZjbVne_ZetFinKSyMjxwWaPRnXVSVqz5QqpUXp6h-34_TL7BaDs='
Now I'm trying to decrypt it in Swift, but with no luck.
So far I've tried CryptoSwift with the following:
func testdec(){
    let str = "3b-Nqg6ry-jrAuDyVjSwEe8wrdyEPQfPuOQNH1q5olE="
    let ba = "gAAAAABhBRBGKSwa7AluNJYhwWaHrQGwAA8UpMH8Wtw3tEoTD2E_-nbeoAvxbtBpFiC0ZjbVne_ZetFinKSyMjxwWaPRnXVSVqz5QqpUXp6h-34_TL7BaDs="
    let encodedString = Base64FS.decodeString(str: String(str.utf8))
    print(encodedString.count)
    let first4 = String(ba.prefix(25))
    let start = first4.index(first4.startIndex, offsetBy: 9)
    let end = first4.index(first4.endIndex, offsetBy: 0)
    let iv = String(first4[start..<end])
    let starta = ba.index(ba.startIndex, offsetBy: 25)
    let enda = ba.index(ba.endIndex, offsetBy: -32)
    let cipher_text = String(ba[starta..<enda])
    let cipher_text_bt: [UInt8] = [UInt8](base64: cipher_text)
    print(cipher_text)
    print(iv)
    let cipher_text_bta: [UInt8] = [UInt8](base64: ba)
//    print(encodedString.bytes.count)
//    let key_bta: [UInt8] = [UInt8](base64: "RgSADaf8w4v9vokuncyzWRbP5hkdhXSETdxIHLDHtKg=")
//    let iv_bt: [UInt8] = [UInt8](base64: "7KUDrsPmb28KQqOWv00KXw==")
//    let cipher_text_bt: [UInt8] = [UInt8](base64: "gAAAAABhBQ837KUDrsPmb28KQqOWv00KX2KjsP2ar6lHLqIPUKSvF1WHiruquG-tiAEkrCZZbm-lFR9ZwxsqVcXovmQ3Hv6pWw==")
    do{
        print("A")
        let aes = try AES(key: encodedString, blockMode: CBC(iv: iv.bytes), padding: .pkcs7)
        print("B")
        let cipherTexta = try aes.decrypt(cipher_text_bt)
        print(cipherTexta)
    }catch{
        print(error)
    }
}
OUTPUT:
16
WaHrQGwAA8UpMH8Wtw3tEoTD2E_-nbeoAvxbtBpFiC0ZjbVne_ZetFinKSyMjxw
RBGKSwa7AluNJYhw
A
B
invalidData
Any help would be appreciated.
I've managed to get your cipher text decrypted using only Apple-provided sources. If you support iOS 13 and up, I suggest you use CryptoKit to verify the HMAC, but for now I've adopted a full CommonCrypto solution.
First, a minor extension to create Data from base64url strings.
import Foundation
import CommonCrypto

extension Data {
    init?(base64URL base64: String) {
        var base64 = base64
            .replacingOccurrences(of: "-", with: "+")
            .replacingOccurrences(of: "_", with: "/")
        if base64.count % 4 != 0 {
            base64.append(String(repeating: "=", count: 4 - base64.count % 4))
        }
        self.init(base64Encoded: base64)
    }
}
The decrypt function is a bit obscure, but it supports the very old CommonCrypto syntax. withUnsafeBytes syntax would be cleaner, but this is a quick workaround.
func decrypt(ciphertext: Data, key: Data, iv: Data) -> Data {
    var decryptor: CCCryptorRef?
    defer {
        CCCryptorRelease(decryptor)
    }

    var key = Array(key)
    var iv = Array(iv)
    var ciphertext = Array(ciphertext)

    CCCryptorCreate(CCOperation(kCCDecrypt), CCAlgorithm(kCCAlgorithmAES), CCOptions(kCCOptionPKCS7Padding), &key, key.count, &iv, &decryptor)

    var outputBytes = [UInt8](repeating: 0, count: CCCryptorGetOutputLength(decryptor, ciphertext.count, false))
    CCCryptorUpdate(decryptor, &ciphertext, ciphertext.count, &outputBytes, outputBytes.count, nil)

    var movedBytes = 0
    var finalBytes = [UInt8](repeating: 0, count: CCCryptorGetOutputLength(decryptor, 0, true))
    CCCryptorFinal(decryptor, &finalBytes, finalBytes.count, &movedBytes)

    return Data(outputBytes + finalBytes[0 ..< movedBytes])
}
Then the HMAC. I suggest you use CryptoKit if you can. This function is of course fixed to one algorithm; there might be ways to make it dynamic, but Fernet only supports SHA256 anyway.
func verifyHMAC(_ mac: Data, authenticating data: Data, using key: Data) -> Bool {
    var data = Array(data)
    var key = Array(key)
    var macOut = [UInt8](repeating: 0, count: Int(CC_SHA256_DIGEST_LENGTH))
    CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA256), &key, key.count, &data, data.count, &macOut)
    return Array(mac) == macOut
}
All of that together comes down to the following code. Note that I do not check the version and/or timestamp, which should be done according to the spec.
let fernetKey = Data(base64URL: "3b-Nqg6ry-jrAuDyVjSwEe8wrdyEPQfPuOQNH1q5olE=")!
let signingKey = fernetKey[0 ..< 16]
let cryptoKey = fernetKey[16 ..< fernetKey.count]
let fernetToken = Data(base64URL: "gAAAAABhBRBGKSwa7AluNJYhwWaHrQGwAA8UpMH8Wtw3tEoTD2E_-nbeoAvxbtBpFiC0ZjbVne_ZetFinKSyMjxwWaPRnXVSVqz5QqpUXp6h-34_TL7BaDs=")!
let version = Data([fernetToken[0]])
let timestamp = fernetToken[1 ..< 9]
let iv = fernetToken[9 ..< 25]
let ciphertext = fernetToken[25 ..< fernetToken.count - 32]
let hmac = fernetToken[fernetToken.count - 32 ..< fernetToken.count]
let plainText = decrypt(ciphertext: ciphertext, key: cryptoKey, iv: iv)
print(plainText, String(data: plainText, encoding: .utf8) ?? "Non utf8")
print(verifyHMAC(hmac, authenticating: version + timestamp + iv + ciphertext, using: signingKey))
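As a sanity check on the Swift result, the token can also be decrypted with the Python library from the question (ttl=None skips the timestamp check, mirroring the Swift code above):

from cryptography.fernet import Fernet

key = b"3b-Nqg6ry-jrAuDyVjSwEe8wrdyEPQfPuOQNH1q5olE="
token = b"gAAAAABhBRBGKSwa7AluNJYhwWaHrQGwAA8UpMH8Wtw3tEoTD2E_-nbeoAvxbtBpFiC0ZjbVne_ZetFinKSyMjxwWaPRnXVSVqz5QqpUXp6h-34_TL7BaDs="
print(Fernet(key).decrypt(token, ttl=None))  # b'my deep dark secret'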
I am trying to communicate bi-directionally between my laptop and my Arduino by passing a common struct back and forth, but am having trouble unpacking the bytes returned from the Arduino. Using pySerialTransfer, rxBuff returns a list of numbers that I can't figure out how to convert into a byte string for struct.unpack to render back into the numbers I started with.
At present, the Arduino code simply receives the data and spits it back, no transformations or action on the Arduino side. Here's my Arduino code:
#include "SerialTransfer.h"
SerialTransfer myTransfer;
struct POSITION {
long id;
float azimuth;
float altitude;
} position;
void setup()
{
Serial.begin(9600);
myTransfer.begin(Serial);
}
void loop()
{
if(myTransfer.available())
{
myTransfer.rxObj(position, sizeof(position));
myTransfer.txObj(position, sizeof(position));
myTransfer.sendData(sizeof(position));
}
else if(myTransfer.status < 0)
{
Serial.print("ERROR: ");
if(myTransfer.status == -1)
Serial.println(F("CRC_ERROR"));
else if(myTransfer.status == -2)
Serial.println(F("PAYLOAD_ERROR"));
else if(myTransfer.status == -3)
Serial.println(F("STOP_BYTE_ERROR"));
}
}
And my Python code:
import time
import struct
from pySerialTransfer import pySerialTransfer


def StuffObject(txfer_obj, val, format_string, object_byte_size, start_pos=0):
    """Insert an object into pySerialtxfer TX buffer starting at the specified index.

    Args:
      txfer_obj: txfer - Transfer class instance to communicate over serial
      val: value to be inserted into TX buffer
      format_string: string used with struct.pack to pack the val
      object_byte_size: integer number of bytes of the object to pack
      start_pos: index of the last byte of the float in the TX buffer + 1

    Returns:
      start_pos for next object
    """
    val_bytes = struct.pack(format_string, *val)
    for index in range(object_byte_size):
        txfer_obj.txBuff[index + start_pos] = val_bytes[index]
    return object_byte_size + start_pos


if __name__ == '__main__':
    try:
        link = pySerialTransfer.SerialTransfer('/dev/cu.usbmodem14201', baud=9600)
        link.open()
        time.sleep(2)  # allow some time for the Arduino to completely reset

        base = time.time()
        while True:
            time.sleep(0.2)
            sent = (4, 1.2, 2.5)
            format_string = 'iff'
            format_size = 4 + 4 + 4
            StuffObject(link, sent, format_string, format_size, start_pos=0)
            link.send(format_size)

            start_time = time.time()
            elapsed_time = 0
            while not link.available() and elapsed_time < 2:
                if link.status < 0:
                    print('ERROR: {}'.format(link.status))
                else:
                    print('.', end='')
                elapsed_time = time.time() - start_time
            print()

            response = link.rxBuff[:link.bytesRead]
            print(response)
            response = struct.unpack(format_string, response)
            print('SENT: %s' % str(sent))
            print('RCVD: %s' % str(response))
            print(' ')
    except KeyboardInterrupt:
        link.close()
When I run the python code, I get the following error:
Traceback (most recent call last):
File "double_float.py", line 54, in <module>
response = struct.unpack(format_string, response)
TypeError: a bytes-like object is required, not 'list'
But just before the error, I print the response: [4, 0, 0, 0, 154, 153, 153, 63, 0, 0, 32, 64].
How do I convert that list into something that struct.unpack can use? Or how do I unpack that numeric list directly? Thanks so much!
Is this what you are looking for:
import struct
data = [4, 0, 0, 0, 154, 153, 153, 63, 0, 0, 32, 64]
binary_str = bytearray(data)
result = struct.unpack("<lff", binary_str)
print(result)
Output
(4, 1.2000000476837158, 2.5)
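Folded back into the question's loop, the fix is a one-liner (a sketch, assuming link is set up as in the question); bytes() works just as well as bytearray(), and the explicit '<' prefix pins little-endian byte order to match the AVR's struct layout:

import struct

# '<lff': little-endian 4-byte long plus two 4-byte floats (the Arduino struct).
response = struct.unpack('<lff', bytes(link.rxBuff[:link.bytesRead]))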
Trying to port these 2 functions from Python to Node.js:
def encrypt(base64_data):
    key = os.urandom(32)
    encoded_key = base64.b16encode(key)
    iv = ""
    iv_vector = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
    for i in iv_vector:
        iv += chr(i)
    ctr = Counter.new(128, initial_value=long(iv.encode("hex"), 16))
    cipher = AES.new(key, AES.MODE_CTR, counter=ctr)
    encrypted = cipher.encrypt(base64_data)
    return encrypted, encoded_key


def decrypt(encrypted_data, orig_key):
    key = base64.b16decode(orig_key)
    iv = ""
    iv_vector = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
    for i in iv_vector:
        iv += chr(i)
    ctr = Counter.new(128, initial_value=long(iv.encode("hex"), 16))
    cipher = AES.new(key, AES.MODE_CTR, counter=ctr)
    decrypted = cipher.decrypt(encrypted_data)
    return decrypted
This decrypt works (when I encrypt in Python, I manage to decrypt in Node), but the opposite way fails.
Any idea what is missing in the Node.js encrypt function?
// Not working
export const encrypt = (base64Data: string): {encryptedData: string, decryptionKey: string} => {
    const key = randomBytes(32);
    const encodedKey = key.toString('hex');
    var iv = Buffer.from('00000000000000000000000000000000', 'hex');
    var encipher = createCipheriv('aes-256-ctr', key, iv);
    const x = Buffer.concat([
        encipher.update(base64Data),
        encipher.final()
    ]);
    return {encryptedData: x.toString('base64'), decryptionKey: encodedKey};
}

// This works
export const decrypt = (base64Data: string, encodedKey: string) => {
    const key = Buffer.from(encodedKey, 'hex')
    var iv = Buffer.from('00000000000000000000000000000000', 'hex');
    var decipher = createDecipheriv('aes-256-ctr', key, iv);
    return Buffer.concat([
        decipher.update(Buffer.from(base64Data)),
        decipher.final()
    ]);
}
If the following points are taken into account, encryption and decryption in both codes interoperate in all possible combinations on my machine.
The Node code must be changed in 2 places so that encryption and decryption are consistent: in the return statement of the encrypt method, '00' + encodedKey must be replaced by '' + encodedKey, otherwise the key for the decryption is one byte too long; in the return statement of the decrypt method, Buffer.from(base64Data) must be replaced by Buffer.from(base64Data, 'base64'), since the ciphertext is Base64-encoded.
The ciphertext in Python (returned by encrypt, passed to decrypt) is a byte array. The ciphertext in Node (returned from encrypt, passed to decrypt) is a Base64-encoded string. Therefore a conversion is necessary here, e.g. in the Python code.
Node returns the key as a hex string with lowercase letters; Python needs uppercase letters. Therefore, an appropriate conversion is necessary here, e.g. in the Python code.
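A short sketch of those Python-side adjustments (the placeholder values stand in for real Node output; they are not from the question):

import base64

encrypted_b64 = "..."   # encryptedData returned by the Node encrypt()
decryption_key = "..."  # decryptionKey returned by the Node encrypt() (lowercase hex)

encrypted_data = base64.b64decode(encrypted_b64)  # Base64 string -> raw bytes
orig_key = decryption_key.upper()                 # b16decode expects uppercase hex
plaintext = decrypt(encrypted_data, orig_key)     # decrypt() from the Python code above

Alternatively, calling base64.b16decode(orig_key, casefold=True) inside decrypt accepts lowercase hex directly and makes the upper() step unnecessary.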
I have the below Java code to encrypt a string with a passkey:
public static String encrypt(String message, String passkey) throws Exception {
    final MessageDigest md = MessageDigest.getInstance("SHA-1");
    final byte[] digestOfPassword = md.digest(passkey.getBytes("utf-8"));
    final byte[] keyBytes = (byte[]) resizeArray(digestOfPassword, 24);
    for (int j = 0, k = 16; j < 8;) {
        keyBytes[k++] = keyBytes[j++];
    }
    final SecretKey key = new SecretKeySpec(keyBytes, "DESede");
    final IvParameterSpec iv = new IvParameterSpec(new byte[8]);
    final Cipher cipher = Cipher.getInstance("DESede/CBC/PKCS5Padding");
    cipher.init(Cipher.ENCRYPT_MODE, key, iv);
    final byte[] plainTextBytes = message.getBytes("utf-8");
    final byte[] cipherText = cipher.doFinal(plainTextBytes);
    String encryptedString = Base64.encodeBase64String(cipherText);
    return encryptedString;
}
Now I have converted the same code into Python (2.7), as below.
def encrypt(message, passkey):
    hash_object = hashlib.sha1(passkey.encode("utf-8"))
    digested_passkey = hash_object.digest()  # hashing
    key24 = "{: <24}".format(digested_passkey)  # resize the byte array to size 24
    import pyDes
    des = pyDes.des(key24)  # at this line I am getting the error "Invalid DES key size. Key must be exactly 8 bytes long."
    message = message.encode('utf-8')
    message = message + (16 - len(message) % 16) * chr(16 - len(message) % 16)  # this is for padding
    iv = Random.new().read(AES.block_size)
    cipher = AES.new(des, AES.MODE_CBC, iv)
    return base64.b64encode(iv + cipher.encrypt(message))
At the line des = pyDes.des(key24), I am getting the error "Invalid DES key size. Key must be exactly 8 bytes long."
The passkey that I sent as parameter is "f!16*hw$sda66"
Can anyone please let me know if there is anything wrong with the line
des = pyDes.des(key24)
I think the reason you are getting this error is that the class initialisation method expects the key to be exactly 8 bytes; if it is anything else, it raises the error you are seeing. This is the __init__ of the class you are calling from pyDes:
# Initialisation
def __init__(self, key, mode=ECB, IV=None, pad=None, padmode=PAD_NORMAL):
    # Sanity checking of arguments.
    if len(key) != 8:
        raise ValueError("Invalid DES key size. Key must be exactly 8 bytes long.")
If you do this for debugging:
def encrypt(message, passkey):
    hash_object = hashlib.sha1(passkey.encode("utf-8"))
    digested_passkey = hash_object.digest()  # hashing
    key24 = "{: <24}".format(digested_passkey)
    print len(key24)
You will see the length of the key is 24, which is why I think it is not being accepted.
I might be wrong but at a quick glance that looks like the issue.
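For what it's worth, here is a hedged sketch (not the original code) of a Python encrypt that mirrors the Java key construction, assuming the pycryptodome package: the Java loop copies the first 8 digest bytes into positions 16-23, so the effective 24-byte DESede key is digest[0:16] + digest[0:8], with an all-zero 8-byte IV and PKCS5 padding.

import base64
import hashlib
from Crypto.Cipher import DES3
from Crypto.Util.Padding import pad

def encrypt(message, passkey):
    digest = hashlib.sha1(passkey.encode("utf-8")).digest()  # 20 bytes
    key24 = digest[:16] + digest[:8]  # mirrors the Java loop: K1 | K2 | K1
    cipher = DES3.new(key24, DES3.MODE_CBC, iv=b"\x00" * 8)  # zero IV, like new byte[8]
    padded = pad(message.encode("utf-8"), DES3.block_size)   # PKCS5 padding for 8-byte blocks
    return base64.b64encode(cipher.encrypt(padded)).decode("ascii")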
I have the same issue as this question but unfortunately there was no answer on it.
I have the following Objective-C code to encrypt using CCCrypt:
- (NSData *)doCrypt:(NSData *)data usingKey:(NSData *)key withInitialVector:(NSData *)iv mode:(int)mode error:(NSError *)error
{
    int buffersize = 0;
    if (data.length % 16 == 0) { buffersize = data.length + 16; }
    else { buffersize = (data.length / 16 + 1) * 16 + 16; }
    // int buffersize = (data.length <= 16) ? 16 : data.length;

    size_t numBytesEncrypted = 0;
    void *buffer = malloc(buffersize * sizeof(uint8_t));
    CCCryptorStatus result = CCCrypt(mode, 0x0, 0x1, [key bytes], [key length], [iv bytes], [data bytes], [data length], buffer, buffersize, &numBytesEncrypted);
    return [NSData dataWithBytesNoCopy:buffer length:numBytesEncrypted freeWhenDone:YES];
}
I use kCCAlgorithmAES128 with kCCOptionPKCS7Padding as options and call the function with [Cryptor doCrypt:data usingKey:key withInitialVector:nil mode:0x0 error:nil];
Now I would like to decrypt it using Python, and to do so I have the following code:
def decrypt(self, data, key):
    iv = '\x00' * 16
    encoder = PKCS7Encoder()
    padded_text = encoder.encode(data)
    mode = AES.MODE_CBC
    cipher = AES.new(key, mode, iv)
    decoded = cipher.decrypt(padded_text)
    return decoded
The PKCS7Encoder looks like this:
class PKCS7Encoder():
    """
    Technique for padding a string as defined in RFC 2315, section 10.3,
    note #2
    """

    class InvalidBlockSizeError(Exception):
        """Raised for invalid block sizes"""
        pass

    def __init__(self, block_size=16):
        if block_size < 2 or block_size > 255:
            raise PKCS7Encoder.InvalidBlockSizeError('The block size must be '
                                                     'between 2 and 255, inclusive')
        self.block_size = block_size

    def encode(self, text):
        text_length = len(text)
        amount_to_pad = self.block_size - (text_length % self.block_size)
        if amount_to_pad == 0:
            amount_to_pad = self.block_size
        pad = chr(amount_to_pad)
        return text + pad * amount_to_pad

    def decode(self, text):
        pad = ord(text[-1])
        return text[:-pad]
Yet whenever I call the decrypt() function, it returns garbage. Am I missing something or having a wrong option enabled somewhere?
Example input and output:
NSData *keyData = [[NSData alloc] initWithRandomData:16];
NSLog(@"key: %@", [keyData hex]);
NSString *str = @"abcdefghijklmno";
NSLog(@"str: %@", str);
NSData *encrypted = [Cryptor encrypt:[str dataUsingEncoding:NSUTF8StringEncoding] usingKey:keyData];
NSLog(@"encrypted str: %@", [encrypted hex]);
Gives:
key: 08b6cb24aaec7d0229312195e43ed829
str: a
encrypted str: 52d61265d22a05efee2c8c0c6cd49e9a
And Python:
cryptor = Cryptor()
encrypted_hex_string = "52d61265d22a05efee2c8c0c6cd49e9a"
hex_key = "08b6cb24aaec7d0229312195e43ed829"
print cryptor.decrypt(encrypted_hex_string.decode("hex"), hex_key.decode("hex"))
Result:
láz
Which is weird, but if I dump the hex I get 610f0f0f0f0f0f0f0f0f0f0f0f0f0f0fb02b09fd58cccf04f042e2c90d6ce17a, and 61 = a, so I think it just displays wrong.
A bigger input:
key: 08b6cb24aaec7d0229312195e43ed829
str: aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
encrypted str: 783fce3eca7ebe60d58b01da3d90105a93bf2d659cfcffc1c2b7f7be7cc0af4016b310551965526ac211f4d6168e3cc5
Result:
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaôNÍ“ƒ˜�Üšw6C%
Here you see that the a's are printed followed by garbage... so I assume this is a padding error or something like that.
The IV is nil on the iOS side and 16 zero bytes on the Python side (see the code).
Your decryption: aes_decrypt(pkcs7_pad(ciphertext))
Correct decryption: pkcs7_unpad(aes_decrypt(ciphertext))
It has to be done this way, because AES in CBC mode expects plaintexts of a multiple of the block size, but you generally want to encrypt arbitrary plaintexts. Therefore, you need to apply the padding before encryption and remove the padding after decryption.
Keep in mind that a - (b % a) cannot be 0 for any (positive) value of a or b. This means that
if amount_to_pad == 0:
amount_to_pad = self.block_size
is unreachable code and can be removed. The good thing is that a - (b % a) already does what you wanted the if block to do.
You should also extend the unpad (decode) function to check that every padding byte is the same byte, and that the padding byte value is neither zero nor larger than the block size.
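A minimal sketch of both fixes in the question's PyCrypto/Python 2 style (decrypt first, then unpad, plus the padding checks just described):

def decrypt(self, data, key):
    iv = '\x00' * 16
    cipher = AES.new(key, AES.MODE_CBC, iv)
    padded = cipher.decrypt(data)         # decrypt first...
    return PKCS7Encoder().decode(padded)  # ...then strip the padding

# Hardened PKCS7Encoder.decode: verify the pad before stripping it.
def decode(self, text):
    pad = ord(text[-1])
    if pad < 1 or pad > self.block_size or text[-pad:] != chr(pad) * pad:
        raise ValueError('Invalid PKCS#7 padding')
    return text[:-pad]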